Coolidge Corner's film pages are taking a snooze a lot. I guess this is the point where I am willing to send them an email so some computer-toucher can have a look at what's going on there
hmm, or maybe it's only some of the pages for the Kurosawa series?
what the heck, I'll see if the issue keeps happening once the series is open, and then I'll reach out if it still is
well, fixing the caching took a while, but now that's sorted… I hope…
In the process I also confirmed my theory from yesterday that Regent Theatre's API nonce changes on a daily basis. Thankfully, it's in the HTML served by the schedule root page, so it's just another web request and a pattern-match.
There is, I guess, potential for TOCTOU with that nonce if the program execution crosses the day boundary, but that's easy: I will "just" put this program on a timer that avoids that problem :D
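Roughly what I mean, as a Python sketch — the URL and the regex are placeholders rather than Regent's real ones, and the "fresh nonce per run" bit is just the lazy TOCTOU dodge from above:

    # sketch only: placeholder URL and pattern, not the real ones
    import re
    import urllib.request

    SCHEDULE_URL = "https://example.org/schedule"  # stand-in for the schedule root page

    def fetch_nonce() -> str:
        # the nonce sits somewhere in the schedule page's HTML, so it's one
        # extra request plus a pattern-match to pick up today's value
        with urllib.request.urlopen(SCHEDULE_URL) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        match = re.search(r'"nonce"\s*:\s*"([0-9a-f]+)"', html)
        if match is None:
            raise RuntimeError("no nonce found on the schedule page")
        return match.group(1)

    # grab it fresh at the start of every run, and schedule the run well away
    # from midnight so the value never rotates underneath us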
several seconds later: ah, crap, I just realized that I'm missing the serialization of one of those types
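For flavor only (the real types aren't shown here, so these are made-up Python stand-ins): the classic shape of that failure is a perfectly innocent type sitting inside the cached structure that the serializer doesn't know how to write until you tell it.

    # made-up types; the point is only that every type that ends up in the
    # cached structure needs a serialization story
    import json
    from dataclasses import dataclass, asdict
    from datetime import datetime

    @dataclass
    class Showing:
        film: str
        starts_at: datetime          # json.dumps has no idea what to do with this

    def to_json(showings: list[Showing]) -> str:
        # default= supplies the missing piece for the one unserializable type
        return json.dumps([asdict(s) for s in showings],
                          default=lambda value: value.isoformat())

    print(to_json([Showing("Ran", datetime(2025, 9, 1, 19, 30))]))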
okay, I have done the needful and now have a re-usable cache mechanism, so I can stop writing the same half-dozen lines again and again
I settled on being okay with caching the "showings by date" internal structure that each provider outputs before everything goes to the global gather. It leaves a lot of data out of the cache, but that's… fine.
The cache is really just there so we don't hammer the upstreams, and if I really want HTTP-layer caching, I can go get someone else's solution for it and plug that in.
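The shape of it, as a Python sketch with invented names (the real structure is surely different): one helper, keyed by provider, that returns the cached "showings by date" blob if it's fresh enough and otherwise calls the provider and writes the result out.

    # invented names throughout; the freshness window is an assumption
    import json
    import time
    from pathlib import Path
    from typing import Callable

    CACHE_DIR = Path("cache")
    MAX_AGE = 6 * 60 * 60  # seconds; refetch after ~6 hours (assumption)

    def cached_showings(provider: str, fetch: Callable[[], dict]) -> dict:
        """Return the provider's {date: [showings]} mapping, hitting the
        upstream at most once per MAX_AGE."""
        CACHE_DIR.mkdir(exist_ok=True)
        path = CACHE_DIR / f"{provider}.json"
        if path.exists() and time.time() - path.stat().st_mtime < MAX_AGE:
            return json.loads(path.read_text())
        showings_by_date = fetch()               # provider does the real work
        path.write_text(json.dumps(showings_by_date))
        return showings_by_date

so each provider call collapses to something like cached_showings("regent", fetch_regent) before the results head off to the gather.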
In addition to their API being… idiosyncratic… the South Asian showings at Apple Cinema are also an interesting edge case. These are separate showings and would not fit into the false dichotomy of dub/sub that one might be tempted to adopt for this domain:
Coolie (Tamil)
Coolie (Telugu)
Coolie (Hindi)
I could see the case for collapsing those to a single listing (especially if the languages could be combined), but I don't think I will bother for now; it's not as big a problem for the reader as the mass-market movies are
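If I ever do collapse them, it would presumably look something like this (Python sketch, invented field names): group on the title and keep the languages as a set.

    # hypothetical collapsing of language variants into one listing
    from collections import defaultdict

    listings = [
        {"title": "Coolie", "language": "Tamil"},
        {"title": "Coolie", "language": "Telugu"},
        {"title": "Coolie", "language": "Hindi"},
    ]

    by_title: dict[str, set[str]] = defaultdict(set)
    for entry in listings:
        by_title[entry["title"]].add(entry["language"])

    for title, languages in by_title.items():
        print(f'{title} ({", ".join(sorted(languages))})')
    # -> Coolie (Hindi, Tamil, Telugu)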
enhance
In which I forget to de-dupe
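(The de-dupe itself is presumably the boring obvious thing — a Python sketch with made-up field names, keyed on whatever actually identifies a showing:)

    # hypothetical de-dupe: keep the first showing seen for each identity key
    def dedupe(showings: list[dict]) -> list[dict]:
        seen: set[tuple] = set()
        unique: list[dict] = []
        for s in showings:
            key = (s["title"], s["venue"], s["starts_at"])  # assumed identity
            if key not in seen:
                seen.add(key)
                unique.append(s)
        return unique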
but spewing 155 files to disk every day, on the other hand, that's another level of obnoxious. It's only 3 MB of data (then again, it's 3 MB of data!), but yeah, maybe I should collate these requests and do the serialization up in the loop that creates them.
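Something like this, sketched in Python with invented names: accumulate the per-date results inside the loop and write one file per provider per run, instead of one per request.

    # sketch: collate per-date requests, serialize once next to the loop
    import json
    from datetime import date, timedelta
    from pathlib import Path

    def fetch_one_day(day: date) -> list[dict]:
        return []  # placeholder for the single upstream request per date

    def fetch_schedule(provider: str, start: date, days: int) -> None:
        showings_by_date: dict[str, list[dict]] = {}
        for offset in range(days):
            day = start + timedelta(days=offset)
            showings_by_date[day.isoformat()] = fetch_one_day(day)
        # one write per provider per run, not one per (provider, date) pair
        Path(f"{provider}.json").write_text(json.dumps(showings_by_date))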
that N*M is 155, by the way. not really enough to justify being annoyed about it, but enough to be annoying
I probably need a caching implementation that I can re-use between providers. I keep re-writing the simple parts of that right where it's needed.
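(The half-dozen lines in question being, roughly — a Python sketch of the pattern, not the actual code:)

    # the ad-hoc caching that keeps getting re-written inline (sketch)
    import json
    import time
    from pathlib import Path

    def fetch_showings() -> dict:
        return {}  # placeholder for the provider's real fetch

    path = Path("provider.json")  # made-up cache file name
    if path.exists() and time.time() - path.stat().st_mtime < 6 * 60 * 60:
        showings = json.loads(path.read_text())
    else:
        showings = fetch_showings()
        path.write_text(json.dumps(showings))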