georgiecasey

New Member
Jaysus, Ajax seems to be the buzzword of the moment. It is pretty cool though. louie, I've a few Ajax books in PDF and CHM format if you don't mind reading off a screen.
 

louie

New Member
I have done it myself before using JavaScript with ASP or PHP, but that approach used to put a lot of code into the source page creating arrays; with Ajax you only get a few lines. A better option.

I got Ajax and PHP by Cristian Darie.
 

louie

New Member
For the menus? Not a clue, really. I got a bit from here, a bit from there, and put it together to work the way I wanted.

Some of the Ajax functions are just common XMLHttpRequest functions; using those, you create your own functions to work with your requests.
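A minimal sketch of that kind of wrapper (the function names here are illustrative, not from louie's actual code):

// Create a request object, falling back to ActiveX for older IE.
function createRequest() {
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();
    }
    return new ActiveXObject("Microsoft.XMLHTTP"); // IE 5/6
}

// Fetch a URL asynchronously and hand the response text to a callback.
function ajaxGet(url, callback) {
    var req = createRequest();
    req.onreadystatechange = function () {
        if (req.readyState == 4 && req.status == 200) {
            callback(req.responseText);
        }
    };
    req.open("GET", url, true);
    req.send(null);
}

Your page-specific functions are then just thin layers that call ajaxGet with the right URL and do something with the response.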
 

RedCardinal

New Member
If you guys are interested in AJAX then I strongly recommend heading over to ajaxian.com if you haven't already done so.

Great place to learn by seeing what other people are doing, and there are loads of tutorial-based how-tos on the site.
 

RedCardinal

New Member
What do you think of this: using Ajax to get files from a different domain?

Just wondering, and maybe I'm missing something, but you seem to be sending the page via the initial HTTP request and then sending two further requests via XMLHttpRequest for the left_menu and main_body.

Is there some advantage to doing this over sending a new HTTP request and letting the server scrape the content and send a single page? I presume you are scraping the content from another site?

I think AJAX is a great tool for specific uses, but you have to be careful not to use XMLHttpRequest for the sake of it, and be very careful that its usage actually lowers overhead, unless there is very real business logic in using it (e.g. edit-in-place and real-time database applications).
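Edit-in-place is a good illustration of where it earns its keep, since a full page reload would throw away the editing context. A rough sketch, assuming a span containing a text node and a made-up save URL (this reuses createRequest from the wrapper sketch above):

// Clicking the text swaps it for an input; leaving the input saves the edit via XHR.
function editInPlace(id, saveUrl) {
    var span = document.getElementById(id);
    span.onclick = function () {
        var input = document.createElement("input");
        input.value = span.firstChild.nodeValue;
        input.onblur = function () {
            span.firstChild.nodeValue = input.value;
            span.style.display = "";
            span.parentNode.removeChild(input);
            // Persist the change without reloading the page.
            var req = createRequest();
            req.open("POST", saveUrl, true);
            req.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
            req.send("id=" + encodeURIComponent(id) + "&value=" + encodeURIComponent(input.value));
        };
        span.style.display = "none";
        span.parentNode.insertBefore(input, span);
        input.focus();
    };
}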

Nice to see some experimentation with AJAX. I hadn't seen that library before either.

Rgds

Richard
 

louie

New Member
OK, you are right: there are two XMLHttpRequest calls made, one for the left menu and the second for the main page.
The reason is that both of them require a seller_id; based on that, we show on the left the categories he has products for, and then in the main page we display his products only.

The website that all that is coming from will have many sellers, and I need to filter products and categories based on the ID they get at the time of registration, so I need the filter.

The XMLHttpRequest is made to a local page that will be on the seller's server, and the requested page (at the moment) uses curl or fopen to fetch the remote page, because JavaScript doesn't allow cross-domain requests for security reasons.
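So the browser only ever talks to a same-origin URL, and the server-side page does the cross-domain fetch with curl or fopen. From the JavaScript side it is just an ordinary request; a sketch, with a made-up proxy filename and parameters (ajaxGet is the wrapper sketched earlier in the thread):

var sellerId = "123"; // example value; in reality this comes from registration

// Ask our own server for the remote content; the proxy page fetches it server-side.
ajaxGet("proxy.php?seller_id=" + encodeURIComponent(sellerId) + "&part=left_menu",
    function (html) {
        document.getElementById("left_menu").innerHTML = html;
    });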

Do you have a better idea?
I tried XML, but found it very complicated to get some features working right, whereas using Ajax and XMLHttpRequest is a lot easier and more flexible.

I am looking into getting the back button and history sorted out for those pages as well.
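One common approach to that (just a sketch of the general idea, not necessarily the route I'll take) is to mirror each Ajax view in location.hash and poll it, so that Back/Forward and bookmarks re-trigger the right request. Again reusing the ajaxGet sketch, with made-up URLs and IDs:

var lastHash = "";

// Record the current view in the URL fragment and load its content.
function loadView(name) {
    lastHash = "#" + name;
    window.location.hash = name;
    ajaxGet("proxy.php?part=" + encodeURIComponent(name), function (html) {
        document.getElementById("main_body").innerHTML = html;
    });
}

// Poll the fragment: Back/Forward change it without a page load,
// so we re-run the matching Ajax request ourselves.
setInterval(function () {
    if (window.location.hash != lastHash && window.location.hash.length > 1) {
        loadView(window.location.hash.substring(1));
    }
}, 200);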
 

RedCardinal

New Member
"OK, you are right: there are two XMLHttpRequest calls made, one for the left menu and the second for the main page.
The reason is that both of them require a seller_id; based on that, we show on the left the categories he has products for, and then in the main page we display his products only."

Don't you have that seller_id when the page is initially requested? If so, just let the server do the work before sending the page, and you remove the two additional requests.

"The website that all that is coming from will have many sellers, and I need to filter products and categories based on the ID they get at the time of registration, so I need the filter."

Again, can't this all be done on the server before you parse the page?

Even when I click on a product to get more details, you could just make this a normal HTTP request (better for SEO) as you are returning a full page anyhow. Maybe I'm totally getting this wrong, but it looks like you are creating extra work for yourself when the simple route (single HTTP request -> server scrapes required page -> server parses data into your template -> server sends page) means less overhead, no problems recovering from page history issues, and far more crawlable pages for your site (of course you need to try and avoid the dupe content filters).

"I tried XML, but found it very complicated to get some features working right, whereas using Ajax and XMLHttpRequest is a lot easier and more flexible."


Do the affiliate sites use RSS feeds? If so, that could be another option, and lightweight in terms of overhead.

Rgds

Richard
 

louie

New Member
I could indeed do all that without the Ajax request, using fopen or curl to write it into the source as well, which is good for SEO, but I am experimenting with Ajax to get the hang of it.

The requested pages will be located on our server; the reason we are doing this is that they contain a lot of sensitive data (database connections), and to be on the safe side I use this option.

I don't want to make too many pages to request all the data, because it might not be worth it, and the seller's server might not have all the PHP features that the pages would require.

I cannot offer an RSS feed at the moment because I don't see the point, and what you can do with it is very limited.
We are trying to build a shopping cart system based on that feed, so it would be complicated to get it done using RSS.

For a start, I made a quick page without using the Ajax onload event to write some source into the background for SEO, here:
http://www.funkyjamjars.ie/LogicsoftTec/a_seller_ajax/index_1.php#

That will probably be better. What do you think?
 

RedCardinal

New Member
Sorry, maybe I didn't explain what I meant too well.

Rather than:
1. me making a request for the page,
2. the server sending me the page,
3. then my browser (via AJAX) sending two requests for the side_bar and main_body,
4. and finally my browser writing them to the DOM via JS (well, actually via innerHTML);

why not just build all that on the server at my first request and save the overhead? (Or am I completely missing something here?)
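For reference, steps 3 and 4 of that flow are roughly the following (reusing the ajaxGet sketch from earlier; the URLs and element IDs are assumptions):

// After the initial page arrives, two further requests fill in the panels.
window.onload = function () {
    ajaxGet("left_menu.php?seller_id=123", function (html) {
        document.getElementById("left_menu").innerHTML = html;
    });
    ajaxGet("main_body.php?seller_id=123", function (html) {
        document.getElementById("main_body").innerHTML = html;
    });
};

All of which the server could have rendered into the page on the first request.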

If this is just an experiment then keep it up! :)

Rgds

Richard.
 