exciter1 Posted December 4, 2014
Your thread is live again: http://community.brickpicker.com/topic/12460-out-of-stock-lego-set-christmas-alerts/
Jeff, do you want me to put the link for the BrickPicker Stock Tracker in the main thread? I will remove the other references.
brickolodon Posted December 4, 2014
I hope we will get more sets on that table...
pcaster Posted December 4, 2014
How about keeping the link to the product page (if it exists) even if the item is out of stock? Especially for shop.lego.com, to manually check if it's just backordered, TOOS (temporarily out of stock), etc.
Jeff Mack (Author) Posted December 4, 2014
> How about keeping the link to the product page (if it exists) even if the item is out of stock? Especially for shop.lego.com, to manually check if it's just backordered, TOOS (temporarily out of stock), etc.
The links are now available on the OOS ones as well.
10230 Posted December 4, 2014
> Running this test overnight was honestly its first stress test and we saw that we will need to raise these servers from 2GB of memory to 6GB. That is another $30/month for each server.
Because I am interested in this on a technical level: why isn't 2GB enough? Assuming the servers only do the task of obtaining the information (pinging the sellers), this should not need that amount of memory, not by far, even if they cached the results for some time. What am I missing?
pcaster Posted December 4, 2014
> The links are now available on the OOS ones as well.
I wish our IT dept had a 3-minute turnaround on enhancements!
EDIT: Looks like the OOS TRU links are to Target.
Jeff Mack (Author) Posted December 4, 2014
> Because I am interested in this on a technical level: why isn't 2GB enough? Assuming the servers only do the task of obtaining the information (pinging the sellers), this should not need that amount of memory, not by far, even if they cached the results for some time. What am I missing?
This is being run on an MS Azure cloud farm. It's compiled C# code, and it is just a little more memory hungry. When you are throwing approximately 100+ connections out every few seconds, it just needs it. This is how we did it; I am sure those of you who know Python or other things that can run on Linux/nginx can come up with something more efficient, but I don't know that. There are also layers of monitoring over the actual crawling, which we are starting now, watching for signs of it blowing up so it can alert us or kick off an automatic restart. I am not the fastest in the world at releasing things, but I try to think things out for the future and where I think it may go. I also don't mind spending a few extra $$ if it means that overall the experience is good.
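Jeff's code isn't public, but the pattern he describes here, roughly a hundred HTTP requests in flight every few seconds with each response buffered for parsing, can be sketched in a few lines. This is a hedged illustration in TypeScript rather than his C#; the URLs, names, and numbers are hypothetical, not BrickPicker's actual implementation:

```typescript
// Hypothetical sketch of a concurrent stock poller, assuming Node 18+
// (built-in fetch). All names and URLs are illustrative, not Jeff's code.

const PRODUCT_URLS: string[] = [
  "https://www.example-retailer.com/product/10230",
  // ...one entry per set/retailer pair being tracked
];

const CONCURRENCY = 100;       // ~100 connections in flight at once
const POLL_INTERVAL_MS = 5000; // re-check every few seconds

async function checkStock(url: string): Promise<boolean> {
  const res = await fetch(url);
  // Each in-flight response body is buffered in memory here; with
  // ~100 of these at once, memory use adds up quickly.
  const html = await res.text();
  return !/out of stock/i.test(html);
}

async function pollOnce(): Promise<void> {
  for (let i = 0; i < PRODUCT_URLS.length; i += CONCURRENCY) {
    const batch = PRODUCT_URLS.slice(i, i + CONCURRENCY);
    const results = await Promise.allSettled(batch.map(checkStock));
    results.forEach((r, j) => {
      if (r.status === "fulfilled") {
        // The real system would write this to a database instead.
        console.log(batch[j], r.value ? "IN STOCK" : "OOS");
      }
    });
  }
}

setInterval(pollOnce, POLL_INTERVAL_MS);
```

Each awaited response body lives in memory until its batch completes, which is one plausible reason memory would scale with the number of concurrent connections, the point 10230 probes further below.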
waddamon Posted December 4, 2014
This is awesome, very cool to a non-tech guy.
tonysbricks Posted December 4, 2014
Something to consider: it sounds like you poll the target pages every second or two, but the page itself is on a 30-second refresh, so roughly 28 polls x 100 targets are wasted per person viewing. You could either slow down polling dramatically (~20s per poll) or use AJAX to update the page and remove the global refresh. That would give people the new data instantly and not waste polls.
legokent Posted December 4, 2014
This is awesome. You've done a great job with this site. Thank you again.
matt1147 Posted December 4, 2014
Great new tool! Thanks Jeff!
Jeff Mack (Author) Posted December 4, 2014
> Something to consider: it sounds like you poll the target pages every second or two, but the page itself is on a 30-second refresh, so roughly 28 polls x 100 targets are wasted per person viewing. You could either slow down polling dramatically (~20s per poll) or use AJAX to update the page and remove the global refresh. That would give people the new data instantly and not waste polls.
The 30-second page refresh has nothing to do with the grabbing of data. The data is being updated in the background and written to the database; I am just saving people from hitting reload/F5 by refreshing the page.
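For readers wondering what tonysbricks' AJAX suggestion would look like in practice, here is a minimal in-page sketch. The endpoint name and DOM selectors are hypothetical assumptions, not BrickPicker's actual API or markup:

```typescript
// Hypothetical in-page updater replacing the 30-second full reload.
// "/api/stock-status" is an assumed endpoint, not a real BrickPicker API,
// and the data-set/.status selectors are assumed markup.
async function refreshTable(): Promise<void> {
  try {
    const res = await fetch("/api/stock-status");
    const rows: { set: string; status: string }[] = await res.json();
    for (const row of rows) {
      const cell = document.querySelector(`[data-set="${row.set}"] .status`);
      if (cell) cell.textContent = row.status; // update in place, no reload
    }
  } catch {
    // Network hiccup: keep showing the last known data.
  }
}

// Poll the endpoint on the same 30-second cadence as the current refresh.
setInterval(refreshTable, 30_000);
refreshTable();
```

The benefit over a meta refresh is that scroll position, sort order, and any in-progress reading survive each update.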
tmock12 Posted December 4, 2014
Jeff, I'm not sure which pages you are scraping, but I found that the mobile versions of pages for multiple sources had their prices updated sometimes a full minute before their web versions. I'm not sure if they are doing some back-end caching on their web versions or what's causing that. Just something to look into.
Jeff Mack (Author) Posted December 4, 2014
> Jeff, I'm not sure which pages you are scraping, but I found that the mobile versions of pages for multiple sources had their prices updated sometimes a full minute before their web versions. I'm not sure if they are doing some back-end caching on their web versions or what's causing that. Just something to look into.
Thanks, I will take a look at that. I didn't even try to look at those.
10230 Posted December 4, 2014
> This is being run on an MS Azure cloud farm. It's compiled C# code, and it is just a little more memory hungry.
Interesting. Do you have an idea where all that memory actually goes? What does it need to store that is that large? I would have imagined that the network is the bottleneck, both bandwidth and latency (and, with that, money), but not memory on the "slaves". Anyway, great that it exists in the first place.
trekgate502 Posted December 4, 2014
> The links are now available on the OOS ones as well.
Jeff, thanks for implementing the feature request!
No More Monkeys Posted December 4, 2014
> Thanks, I will take a look at that. I didn't even try to look at those.
It would also take less bandwidth, since mobile pages are something like 10 times smaller.
johnwray Posted December 4, 2014
I agree that the "Higher than retail" boxes should be yellow, and the "Lower than retail" boxes should be blue. It's hard to see the exclamation points and tags at a glance, which is what we need to be able to do considering how many sets there are.
EDIT: I'd also like to see the Amazon column moved all the way to the right. The prices fluctuate, but there's always a unit available at Amazon, so it's not as useful if we're scanning the tracker.
EDIT #2: A few more ideas:
- List the sets numerically in descending order. It'll make it easier for us to find the information we need, and it puts the exclusives closer to the top.
- Alternate color or brightness row by row to make the info easier to read. I see that there's a slight variation between each row (white/light gray/white/light gray), but it's not enough to improve the eye's ability to read straight across.
Amazing tool, by the way. I really hope the guy who first coded a tracker gets credit and compensation.
wildkarrde5 Posted December 4, 2014
This is excellent!
Sshackshooter Posted December 4, 2014
Wow, excellent tool, thank you very much.
xeeeej Posted December 4, 2014
Very nice tool. Thank you, and an honorable mention to tmock. Can you make it sort the list by set number? If that's already a feature, I didn't see it. Thanks!
LegoManiacc Posted December 4, 2014
I like the different shading you just added, that helps.
Jeff Mack (Author) Posted December 4, 2014
> I like the different shading you just added, that helps.
Wow, you were right on that change.
cissi Posted December 4, 2014
This is very cool, Jeff! Now everyone doesn't need to go write their own code. Brilliant! Do you think it'll be possible to sort the list by, say, set number or name?
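Sorting isn't shown as implemented anywhere in this thread, but a client-side version of what cissi and xeeeej are asking for could be as small as the following sketch. The table id and column layout are assumptions, not the tracker's real markup:

```typescript
// Hypothetical client-side sort for the tracker table. Assumes the set
// number is the text of each row's first cell; adjust the index otherwise.
function sortBySetNumber(table: HTMLTableElement, descending = true): void {
  const body = table.tBodies[0];
  const rows = Array.from(body.rows);
  rows.sort((a, b) => {
    const numA = parseInt(a.cells[0].textContent ?? "0", 10);
    const numB = parseInt(b.cells[0].textContent ?? "0", 10);
    return descending ? numB - numA : numA - numB;
  });
  // Re-append in sorted order; appendChild moves the existing rows.
  rows.forEach((row) => body.appendChild(row));
}

// Usage, assuming the tracker table had id "stock-tracker":
// sortBySetNumber(document.getElementById("stock-tracker") as HTMLTableElement);
```

Descending order would also satisfy johnwray's earlier point about pushing the high-numbered exclusives toward the top.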
LegoManiacc Posted December 4, 2014
> Wow, you were right on that change.
Indeed! I just happened to be looking at it as it changed before my eyes... like a magic trick.