Discussion in 'Coding Help' started by louie, Jul 26, 2006.
Does anybody have any recommendations?
The O'Reilly books are usually good:
Jaysus, AJAX seems to be the buzzword of the moment. It is pretty cool though. louie, I've a few AJAX books in PDF and CHM format if you don't mind reading off a screen.
Yeah, that would be nice if they are also PHP-related.
I got one last night but it's mostly ASP.NET.
Don't know what these are, haven't really looked through them. I only have 5.
I'll look at them and let you know.
I can't wait to get the book I have ordered. This AJAX is amazing.
There is an example of a dependent select menu populated from a database.
It works like a dream.
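For anyone curious, a dependent select menu along these lines usually works by firing a request in the first menu's onchange handler and rebuilding the second menu from the response. A rough sketch follows; the URL, field names, and pipe-delimited response format are my own assumptions, not the actual script from the book:

```javascript
// Rebuild the second <select> from a server response.
// Assumed response format: one "value|label" pair per line, e.g.
//   12|Ford Focus
//   15|Ford Mondeo
function buildOptions(responseText) {
  var html = "";
  var lines = responseText.split("\n");
  for (var i = 0; i < lines.length; i++) {
    if (lines[i] === "") continue;
    var parts = lines[i].split("|");
    html += '<option value="' + parts[0] + '">' + parts[1] + "</option>";
  }
  return html;
}

// Wire it up: when the first menu changes, fetch the matching entries.
// getmodels.php (hypothetical) would query the database and print the pairs above.
function onMakeChange(makeSelect) {
  var xhr = window.XMLHttpRequest
    ? new XMLHttpRequest()
    : new ActiveXObject("Microsoft.XMLHTTP"); // older IE fallback
  xhr.open("GET", "getmodels.php?make_id=" + makeSelect.value, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById("model").innerHTML = buildOptions(xhr.responseText);
    }
  };
  xhr.send(null);
}
```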
Er, hasn't that been done before??? Dependent select menus, that is.
Plus, what book did you order?
I got AJAX and PHP by Cristian Darie.
Any chance of showing where you got the script for that?
I don't get it.
The link above has that (the car search menu).
Also another AJAX example is here: http://www.funkyjamjars.ie/ajax/rss_reader/ (still working on it because it needs PHP 5, so I am trying to get it to work with PHP < 5).
Also another one here: http://www.funkyjamjars.ie/carsales_dealers.htm
What I mean is: what site did you get the script from? Not where you implemented it.
For the menus? Not a clue, really. I got a bit from here, a bit from there, and put it together to work the way I wanted.
Some of the AJAX functions are just common XMLHttpRequest functions; using those, you create your own functions to work with your request.
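A minimal sketch of the kind of wrapper being described, assuming a 2006-era cross-browser setup (all function names here are my own, not from any particular library):

```javascript
// Create an XMLHttpRequest, falling back to ActiveX for older IE.
function createRequest() {
  if (window.XMLHttpRequest) {
    return new XMLHttpRequest();
  }
  return new ActiveXObject("Microsoft.XMLHTTP"); // IE 5/6
}

// Turn a plain object into a query string for GET requests.
function buildQuery(params) {
  var parts = [];
  for (var key in params) {
    parts.push(encodeURIComponent(key) + "=" + encodeURIComponent(params[key]));
  }
  return parts.join("&");
}

// Generic GET helper: your own request functions are built on top of this,
// passing the response text to a callback when the request completes.
function ajaxGet(url, params, callback) {
  var xhr = createRequest();
  xhr.open("GET", url + "?" + buildQuery(params), true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(xhr.responseText);
    }
  };
  xhr.send(null);
}
```

From there, a page-specific function like a seller filter is just `ajaxGet("products.php", {seller_id: 5}, showProducts)` with your own callback.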
If you guys are interested in AJAX then I strongly recommend heading over to ajaxian.com if you haven't already done so.
Great place to learn by seeing what other people are doing, and there are loads of tutorial-based how-tos on the site.
What do you think of this: using AJAX to get files from a different domain?
Just wondering, maybe I'm missing something, but you seem to be sending the page via the initial HTTP request, then sending 2 further requests via XMLHttpRequest for the left_menu and main_body.
Is there some advantage to doing this over sending a new HTTP request and letting the server scrape the content and send a single page? I presume you are scraping the content from another site?
I think AJAX is a great tool for specific uses, but you have to be careful not to use XMLHttpRequest for the sake of it, and be very careful that its usage lowers overhead unless there is very real business logic in using it (e.g. editing in place and real-time database applications).
Nice to see some experimentation with AJAX. I hadn't seen that library before either.
OK, you are right, there are 2 XMLHttpRequest calls made: one for the left menu and the second for the main page.
The reason is that both of them require some seller_id; based on that, the left side will show the categories the seller has products for, and then in the main page we display his products only.
The website that all that is coming from will have many sellers, and I need to filter products and categories based on the ID they get at registration time, so I need the filter.
Do you have a better idea?
I tried XML, but found it very complicated to get some features working right; using AJAX and XMLHttpRequest is a lot easier and more flexible.
I am looking into getting the back button and history sorted out for those pages as well.
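The usual trick from that era for back-button support was to mirror the AJAX state in location.hash and poll for changes, since there was no history API or onhashchange event yet. A rough sketch, assuming hypothetical seller/category state (function and parameter names are mine):

```javascript
// Parse a fragment like "#seller=5&cat=suvs" into a state object.
function parseHash(hash) {
  var state = {};
  var pairs = hash.replace(/^#/, "").split("&");
  for (var i = 0; i < pairs.length; i++) {
    if (pairs[i] === "") continue;
    var kv = pairs[i].split("=");
    state[kv[0]] = kv[1];
  }
  return state;
}

// Record each AJAX navigation in the fragment so it lands in browser history.
function rememberState(seller, cat) {
  window.location.hash = "seller=" + seller + "&cat=" + cat;
}

// No onhashchange back then, so poll; when the user hits back/forward,
// the hash changes and we re-run the AJAX load for that state.
var lastHash = "";
function watchHistory(loadPage) {
  setInterval(function () {
    if (window.location.hash !== lastHash) {
      lastHash = window.location.hash;
      loadPage(parseHash(lastHash));
    }
  }, 200);
}
```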
Don't you have that seller_id when the page is initially requested? If so, just let the server do the work before sending the page, and you remove the 2 additional requests.
Again, can't this all be done on the server before you parse the page?
Even when I click on a product to get more details, you could just make this a normal HTTP request (better for SEO), as you are returning a full page anyhow. Maybe I'm totally getting this wrong, but it looks like you are creating extra work for yourself when the simple route (single HTTP request -> server scrapes required page -> server parses data into your template -> server sends page) means less overhead, no problems recovering from page history issues, and far more crawlable pages for your site (of course you need to try and avoid the dupe content filters).
Do the affiliate sites use RSS feeds? That could be another option if so, and lightweight in terms of overhead.
I could indeed do all that without the AJAX request, using fopen or cURL to write it into the source as well, which is good for SEO, but I am experimenting with AJAX to get the hang of it.
The pages requested will be located on our server, and the reason we are doing this is that they contain a lot of sensitive data (database connections), so to be on the safe side I use this option.
I don't want to make too many pages to request all the data, because it might not be worth it, and the seller's server might not have all the PHP features that the pages will require.
I cannot offer an RSS feed at the moment because I don't see the point, and you are very limited in what you can do with it.
We are trying to get a shopping cart system based on that feed, so it would be complicated to get it done using RSS.
For a start, I made a quick page here without AJAX, using the onload event to write some source into the background for SEO:
That will probably be better. What do you think?
Sorry maybe I didn't explain what I meant too well:
1. me making a request for the page,
2. the server sending me the page,
3. then my browser (via AJAX) sending two requests for the side_bar and main_body,
4. and finally writing them to the dom via JS (well actually via innerHTML);
why not just build all that on the server at my first request and save the overhead? (Am I completely missing something here?)
If this is just for experiment then keep it up!