Results 1 to 5 of 5
  • #1
    Regular Coder
    Join Date
    Feb 2005
    Posts
    400
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Possible to fopen() URLs asynchronously?

    I need to fetch and parse 5 to 10 remote files. The parsing is quick and easy, but the fetch is a problem -- 20 to 30 seconds to get each <1K text file. Needless to say, the wait is aggravating.

    Is there a way to pipeline fopen() or one of its cousins? I'd love to send all the file requests simultaneously, then use callback functions to parse the responses and build my output.

  • #2
    Senior Coder
    Join Date
    Jul 2003
    Location
    My pimped-out igloo in Canadia
    Posts
    1,966
    Thanks
    36
    Thanked 0 Times in 0 Posts
    Thanks scroots

    Last edited by canadianjameson; 06-14-2005 at 10:10 PM.
    Before you criticize someone, you should walk a mile in their shoes. That way, when you criticize them, you're a mile away and you have their shoes :)


  • #4
    Senior Coder
    Join Date
    Jun 2002
    Location
    UK
    Posts
    1,137
    Thanks
    0
    Thanked 0 Times in 0 Posts
    What is your current code, and what are you doing with the included files? If you are just including them to show their content or use their functions, you could use include or require. By the sound of it, I think you want to do something like search the contents?

    scroots
    Spammers, next time you spam me consider the implications:
    (1) that you will be persuaded by me (in a legitimate manner)
    (2) it is worthless to you when I have finished

  • #5
    Regular Coder
    Join Date
    Feb 2005
    Posts
    400
    Thanks
    0
    Thanked 0 Times in 0 Posts
    The files I need to fetch/parse contain a paragraph of text and assorted tab-separated values. They're used in a table on a dynamic monitoring page.

    Grabbing any one of them, exploding the contents, and splitting the parts into the HTML of the page-in-progress is simple. The problem comes when the remote server is being slow. By the time the last file trickles in, the first file is outdated. Or the client has pushed the refresh button. Or the client browser has timed out the connection.
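
A minimal sketch of that explode-and-split step, assuming (the exact file layout is a guess) that the first line is the paragraph and the remaining lines are tab-separated values:

```php
<?php
// Hypothetical layout: first line is a paragraph of text, each remaining
// line is a row of tab-separated values. Returns the paragraph plus one
// array per data row.
function parse_status_file(string $contents): array
{
    $lines = explode("\n", trim($contents));
    $paragraph = array_shift($lines);
    $rows = [];
    foreach ($lines as $line) {
        $rows[] = explode("\t", $line);
    }
    return [$paragraph, $rows];
}

[$text, $rows] = parse_status_file("All systems up\nload\t0.3\nuptime\t99.9");
// $text is "All systems up"; $rows is [["load", "0.3"], ["uptime", "99.9"]]
```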

    The best thing would be to speed up the remote server, but that's out of my control.

    Currently, I'm using a simple for loop to open and parse the files one at a time. Mostly my script sits idle waiting for the remote server. I'd like to make the requests asynchronously, with callback functions doing the parsing.
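
For reference, the blocking loop described here presumably looks something like this (URLs omitted; variable names are made up):

```php
<?php
// Sequential fetch: each file_get_contents() call blocks until the remote
// server responds, so total wall time is the SUM of all the slow fetches.
$urls = [ /* 5-10 remote file URLs */ ];
$results = [];
foreach ($urls as $url) {
    $contents = @file_get_contents($url); // blocks 20-30s per slow file
    if ($contents !== false) {
        $results[$url] = $contents;
    }
}
```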

    Is it possible to make asynchronous requests in PHP? Is there any multithreading support at all?
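
For what it's worth, PHP's curl extension can do the parallel part of this via the curl_multi_* functions, which drive several transfers at once from a single script with no threads. A hedged sketch, assuming the curl extension is installed (the function name and timeout are my own choices):

```php
<?php
// Fetch several URLs in parallel with curl_multi_* and return the bodies
// keyed by URL. A sketch, not production code: failures just yield null.
function fetch_all(array $urls, int $timeout = 10): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until none are still active; curl_multi_select()
    // sleeps until at least one socket has activity, so the script is not
    // busy-waiting.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh);
        }
    } while ($active && $status === CURLM_OK);

    $bodies = [];
    foreach ($handles as $url => $ch) {
        $body = curl_multi_getcontent($ch);
        $bodies[$url] = ($body === '' || $body === null) ? null : $body;
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $bodies;
}
```

With this shape, total wall time is roughly the slowest single fetch rather than the sum of all of them. There is no built-in per-response callback at this level, but polling curl_multi_info_read() inside the loop would let each file be parsed as soon as it completes instead of after the whole batch.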

