  1. #1
    New Coder (joined Sep 2006)

    Resource efficiency - single load vs. JS compilation

    I tried googling this but, as is often the case, defining the question (search criteria) proved a challenge.

    So, if you have any thoughts on this I would be pleased to learn them.

    I am using jQuery Isotope as a language dictionary: lots of coloured tiles with word data in each tile. Roughly 100 tiles per page (per letter: A, B, etc.; some letters are split, e.g. A-An, An-Az, as they would otherwise have too many tiles per page). Each tile has a repeated structure of divs, and on the page each tile is probably about 15% HTML and 85% content. I have just started building my data and was planning on simply compiling a complete HTML file for each letter. The first, A-An, is 40 kB.

    I am using $.ajax({ url: '' }); to load the file into a new tile-set, and wondered about the overall efficiency of this single file load vs. a JS compilation from a database. I should mention here that I never graduated beyond flat-file DBs - the prerogative of the aged.
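    In case it is useful, the load itself is roughly this (the url and selector are placeholders, and reloadItems/layout are the Isotope v2 method names):

        // fetch the pre-built html for one letter and hand the tiles to an
        // already-initialised isotope grid
        $.ajax({ url: 'letters/a-an.html', dataType: 'html' }).done(function (html) {
            var $grid = $('#tiles');
            $grid.empty().append($(html));
            $grid.isotope('reloadItems').isotope('layout');
        });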

  • #2
    Senior Coder rnd me (joined Jun 2007, Urbana)
    with only ~6 kB of repetition per page, you won't see a huge network or page-load performance impact either way.

    separate pages make it easy to use back/forward to navigate; ajax might render subsequent pages slightly faster than a full refresh.

    compared to maintaining 30+ HTML files, it might be easier to use ajax and flat files to populate a single shell HTML template with a given dataset. or you could use a cms, SSI, or even dreamweaver templates to make managing 30 pages pretty simple.

    if it were up to me, i would probably feed csv files to a JS-based parser, feed the resulting js objects to a mustache template, and feed the resulting html to isotope.
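
    untested sketch of that pipeline (the csv url, column names, and template markup are made up here, and it assumes mustache.js and isotope are already on the page):

        // render each row as one isotope tile via a mustache template
        var tileTemplate =
            '{{#words}}' +
            '<div class="tile {{colour}}">' +
            '  <h3>{{word}}</h3>' +
            '  <p>{{definition}}</p>' +
            '</div>' +
            '{{/words}}';

        // naive csv parser - fine for simple data with no quoted commas
        function parseCsv(text) {
            var lines = text.trim().split('\n');
            var headers = lines.shift().split(',');
            return lines.map(function (line) {
                var cells = line.split(',');
                var row = {};
                headers.forEach(function (h, i) { row[h] = cells[i]; });
                return row;
            });
        }

        $.ajax({ url: 'data/a-an.csv', dataType: 'text' }).done(function (csv) {
            var html = Mustache.render(tileTemplate, { words: parseCsv(csv) });
            $('#tiles').isotope('insert', $(html));
        });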
    BROWSER STATS [% share] (2014/9/03): IE7: 0.1, IE8: 4.3, IE9: 2.7, IE10: 2.6, IE11: 9.2, FF: 16.8, CH: 47.5, SF: 7.8, NON-MOUSE: 37%

  • #3
    Supreme Master coder! Old Pedant (joined Feb 2009)
    I would agree, save only that I would also look at loading XML instead of CSV, depending on what the data being processed looks like. The XML will be larger than the CSV, but it might be easier to save and process. Can't tell without seeing the data.
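
    For instance, if each tile were an <entry> element (the element names here are invented), jQuery will walk the XML for you and hand back the same array of objects the CSV parser would produce:

        // hypothetical layout: <entries><entry><word>...</word><definition>...</definition></entry>...</entries>
        $.ajax({ url: 'data/a-an.xml', dataType: 'xml' }).done(function (doc) {
            var words = $(doc).find('entry').map(function () {
                var $e = $(this);
                return {
                    word: $e.find('word').text(),
                    definition: $e.find('definition').text()
                };
            }).get();
            // 'words' is now ready for the template step
        });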

    A third option would be to store the data in a database and serve it as JSON. It is fairly easy to convert a set of records to JSON, and it would almost surely be the fastest to handle client-side (not that speed is likely to be an issue here).
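
    On the client it is barely any code at all; 'words.php' below is a hypothetical server script that SELECTs the rows and echoes them as JSON, and tileTemplate is the Mustache template from rnd me's sketch above:

        // expects a response like [{"word":"aardvark","definition":"..."}, ...]
        $.getJSON('words.php', { letter: 'a' }).done(function (words) {
            // no parsing step needed - the array goes straight into the template
            var html = Mustache.render(tileTemplate, { words: words });
            $('#tiles').isotope('insert', $(html));
        });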
    An optimist sees the glass as half full.
    A pessimist sees the glass as half empty.
    A realist drinks it no matter how much there is.

  • #4
    New Coder (joined Sep 2006)
    In some ways your answers are a bit of a relief - partly because I have already started, and partly because evolving the extra code, while a fun challenge, is extra work. Mind you, there is a nice example of fakeElemts.js that comes with Isotope.

    @rnd me
    Interesting stats - Chrome @ 43%
    No IE6

    Thanks to you both. All I have to do now is crunch the data.

