Re: [Mingw-cross-env-list] Mingw in a production environment
From: Tony Theodore
Subject: Re: [Mingw-cross-env-list] Mingw in a production environment
Date: Sat, 22 Sep 2012 23:30:37 +1000
Hi Volker,
On 22/09/2012, at 4:27 AM, Volker Grabsch <address@hidden> wrote:
> One detail question: How exactly do you update and/or upload new
> packages?
There's an EC2 instance that runs a very crude script every hour - it's
basically just git pull, make download, s3cmd sync [1]. I've added the script
to the tools directory so people can improve it; eventually it could do
something like 'wget --spider' to check for broken links.
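For anyone who wants to improve it, the hourly job amounts to something like the sketch below. The repo path, bucket name, and the DRY_RUN switch are all placeholders for illustration, not the actual script in the tools directory:

```shell
#!/bin/sh
# Hypothetical sketch of the hourly mirror job: pull the latest
# makefiles, fetch all upstream tarballs, then sync the package
# cache to S3. DRY_RUN=1 (the default here, for illustration)
# prints each command instead of executing it.
DRY_RUN="${DRY_RUN:-1}"
REPO="${REPO:-/home/mirror/mingw-cross-env}"
BUCKET="${BUCKET:-s3://example-mirror-bucket/pkg/}"

run() {
    if [ "$DRY_RUN" = 1 ]; then
        echo "would run: $*"
    else
        "$@" || exit 1
    fi
}

run cd "$REPO"
run git pull
run make download
run s3cmd sync pkg/ "$BUCKET"
```

A link checker ('wget --spider' over each package URL) would slot in naturally before the sync step.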
> Tony Theodore schrieb:
>> It's just under 1GB, so if the rest of the internet goes down, it will
>> cost 12 cents from S3 or 19 cents from the CDN.
>
> An alternative would be to rent a VServer at a fixed price
> (usually about 5-10 EUR / month); these usually include lots
> of traffic. In the unlikely case the traffic exceeds
> the limit, it usually doesn't get more expensive, but the
> network speed is simply reduced from 100 MBit/s to 10 MBit/s.
>
> I'm not sure whether I'd really prefer this over S3, but I'd
> like to point out the alternative.
Thanks, I haven't seen a hosting account with a fixed price before. The main
thing I like about S3 is that it's also easy to set up redundancy and a CDN.
> Also, I'd like to discuss whether we should use such a mirror as
> fallback or as a primary source. I think that it is better to use
> it only as fallback, but I'd like to hear other opinions, too.
>
> First of all, we include big packages such as Qt which already have
> their own mirror systems. It makes sense to prefer those over ours,
> as they probably deal with more traffic and thus have better solutions
> in place than we have.
Agreed. Note, though, that even projects like Apache, which have good
infrastructure, follow a convention of keeping only the most recent version on
the main site and moving previous versions to an archive directory (breaking
all existing links in the process).
> More importantly, however, I think it is very valuable to notice if
> some maintainer uploads a newer package version at the same URL. Of
> course, this is a very bad practice (they should always release a
> new version instead). Nevertheless, in case this happens we almost
> certainly want to use the upgraded package instead. We wouldn't
> notice if we'd always download from the mirror first.
The way it's currently implemented, it would always fetch from the primary
source and then distribute that, so this case should be covered (though I don't
know the intricacies of s3cmd [1]).
I think it would be very cumbersome to try to use our mirrors first: we'd have
to use different logic in the "update" and "download" Makefile targets and
possibly add a new target that uploads the new files. Scratch that - all we'd
need to do is change the order of the wget lines.
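A minimal sketch of what that reordering might look like, using hypothetical variable names ($(PKG_DIR), $(MIRROR_URL)) rather than the project's actual ones - the upstream URL is tried first, and the mirror only on failure:

```makefile
# Sketch only: try the upstream $(PKG)_URL first, so a maintainer
# re-uploading a package at the same URL is still noticed; fall
# back to the project mirror only when the primary download fails.
download-pkg:
	wget -c -O '$(PKG_DIR)/$($(PKG)_FILE)' '$($(PKG)_URL)' || \
	wget -c -O '$(PKG_DIR)/$($(PKG)_FILE)' '$(MIRROR_URL)/$($(PKG)_FILE)'
```

Preferring the mirror would just mean swapping the two wget lines, which is why either policy is cheap to implement.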
Cheers,
Tony
[1] http://s3tools.org/s3cmd