  1. #1 Nightfire (Senior Coder)

    Saving resources and optimising the loading times

    The title roughly gives away what I'm going to be asking.

    Basically, I have a MySQL table in the nested set model that holds the categories, subcategories, etc. for a large business directory that's being developed.
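
    For context, the kind of query that runs on every page load is the usual nested set subtree fetch, something like this (the schema and column names are simplified for the example):

    Code:
    <?php
    // Hypothetical schema: categories(id, name, lft, rgt) in the nested set model.
    // One indexed query pulls a whole subtree, ordered for display.
    $pdo = new PDO('mysql:host=localhost;dbname=directory', 'user', 'pass');

    $stmt = $pdo->prepare(
        'SELECT child.id, child.name
           FROM categories AS parent
           JOIN categories AS child
             ON child.lft BETWEEN parent.lft AND parent.rgt
          WHERE parent.id = :id
          ORDER BY child.lft'
    );
    $stmt->execute(array('id' => 1));
    $subtree = $stmt->fetchAll(PDO::FETCH_ASSOC);
    ?>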

    I was thinking that, instead of querying the table on every page load (which I'm guessing may cause 'lag' if there are lots of visitors), I could dump the database table into a structured XML file. Then, instead of querying the DB all the time, the site would just pull the info from the XML file.

    Now I'm wondering if opening an XML file all the time would use fewer resources than querying the DB.

    Is there any way to test this sort of thing? I want to get it right now, while the site's being made, rather than 'hotfix' it once it's up and running.

  • #2 JamesOxford (New Coder, Birmingham, MI)
    Added after response written: Sorry, I rambled a little.

    So I guess it probably really matters how big your categories/subcategories table is. How many rows does the table have?

    My answer is purely speculative, and in some ways I'm not even sure it matters how big the table is, but I think it will be faster to pull the information from the DB.

    The reason is that PHP is stateless, with the exception of sessions: it has no memory of what happened on the last page. So you are either going to have to load the XML file on every page, or query the DB. With the XML file, you would then have to parse it line by line until you find the entry you are looking for. Not only is that not very efficient, but I think it will end up requiring more computing power, and more bandwidth (although the bandwidth use would be limited), because you are sending everything on every page load.

    The database, on the other hand, is indexed and can return just a single result. It can take shortcuts to find the information it is looking for because it is indexed, and the software is optimized to find information quickly. Parsing an XML file is almost certainly going to be slower. Regardless of how large the table is, and whatever storage hardware the server uses, you will probably find that the database can return the result in a few thousandths of a second; there is no way you can parse XML faster. Every other cost (making the connection to the server, and the server's response) is shared between the two, and the XML scenario has the added disadvantage of having to download everything.

    EDIT: There is actually a flaw in this reasoning. You would not have to download the XML file every time; you can read it straight off the server's disk... but you still need to find the file, open it, read it, and close it. Databases work more efficiently by keeping tables cached in memory (correct me if I am wrong). Querying the database will still be faster.
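
    To make that concrete, here is roughly what the two approaches look like (the file name, table, and column names are made up for the example):

    Code:
    <?php
    // Approach 1: one indexed lookup; MySQL returns just the row you asked for.
    $pdo  = new PDO('mysql:host=localhost;dbname=directory', 'user', 'pass');
    $stmt = $pdo->prepare('SELECT name FROM categories WHERE id = :id');
    $stmt->execute(array('id' => 42));
    $name = $stmt->fetchColumn();

    // Approach 2: the whole XML dump is read and parsed before you can scan it.
    $xml = simplexml_load_file('categories.xml');
    foreach ($xml->category as $cat) {
        if ((int) $cat['id'] === 42) {   // linear scan for the same row
            $name = (string) $cat->name;
            break;
        }
    }
    ?>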
    Last edited by JamesOxford; 08-21-2011 at 03:23 AM. Reason: explained above

  • #3 Nightfire (Senior Coder)
    Cheers for that. There are a few hundred entries in the table, so I'll stick to the DB for now then.

  • #4 JamesOxford (New Coder, Birmingham, MI)
    Cheers... With a few hundred entries I wouldn't worry about one query per page load. If you want to minimize load times, get rid of any unnecessary queries. Remember, that is what database software is designed to do: store and retrieve data... quickly.

    In my experience, it doesn't really seem to be the queries that slow things down all that much. I am still trying to figure out what is slowing down one of my scripts, so this is a topic I have spent some time looking at, and it isn't the queries. I put microtime() timers on each step of the script, and loops inside loops seem to really slow things down, as do possibly large, nested arrays. If anyone can offer more insight into what slows PHP down, I would certainly be interested in reading it.
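
    If it helps, this is the sort of timer I mean; wrap it around whichever step you suspect (the step itself here is just a stand-in):

    Code:
    <?php
    // microtime(true) returns the current time as a float, in seconds.
    $start = microtime(true);

    // ... the step you suspect: a query, an XML parse, a nested loop ...
    for ($i = 0; $i < 100000; $i++) {
        $dummy = sqrt($i);
    }

    printf("step took %.6f seconds\n", microtime(true) - $start);
    ?>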

  • #5 JamesOxford (New Coder, Birmingham, MI)
    Hey mate... is Blue Panda yours?

  • #6 Nightfire (Senior Coder)
    Yeah, it's mine.

