

Offline
Joined: 2009 Apr 18
Alternative ULs/distribution of BIGGER games?

Sorry if this has been covered in a topic elsewhere on this forum!
I'm trying to figure out an easy, workable way to handle games over the 400MB limit; it would be fantastic if that could be done. The ideal solution would not take resources from this server, would be accessible under the same conditions as this server and would run with the same security as this server. It would/should be accessible cross-platform, so no mysterious, no matter how effective, protocols and encryption running on only one platform.
• The first thing springing to mind is a t0rrent network; the strain would be shared by the people participating. I'm not sure, however, how much it takes to set up the distribution of the actual t0rrents.
• The second thing is more retro-fashioned: revive the HotLine era. Clients/servers exist for most OSes. They can even be run from inside BasiliskII/Sheepshaver! Wink Compatibility with newer systems might be a snag, as might accessibility, since each unique file can only be accessed from one server at a time. This often means looooong DL times.
• The third thing is using file hosting services like Rapidshare/Hotfile/Mediafire etc. These are often crippled when it comes to file size, DL speed and accessibility for those not paying. And the files need to be accessed/refreshed on a regular basis. But the strain is put elsewhere than here.

These are only my thoughts. Anyone with more/better/cheaper solutions is welcome to put their suggestions up here. Smile

Comments

bertyboy's picture
Offline
Joined: 2009 Jun 14

Maybe an option or two, even harder than those already suggested:

Don't know how you would go about convincing something like info-mac to host the larger files / all the files. The total disk space required would be something the info-mac archives are not used to, I think they topped out at somewhere around 10GB when they stopped mirroring every night. There are still some mirrors, but I think the cost of the mirroring of all the changed content was too much.

Or something from the likes of Google; they appear to have lots of disk available for email and other services. But again, I don't know if there is anything suitable. I'm not even sure how much disk the site would need; take whatever is there now and multiply by ten. You'd have to consider practically every bit of Mac software from 1993 to 2000.

Whatever the solution, it's a step away from retaining control over the site, from ensuring that there are nightly backups and that they're usable; away from protecting against the provider going under and the site losing all the files (and the backups).
It also makes it a full-time job keeping out warez. I'm scurrying off more and more often to check the suitability as "abandonware" of some of the uploads already, e.g. Adobe still provides updates for Acrobat Mac as far back as v3. Should we be hosting Acrobat 5 Professional? Do we consider Acrobat v5 unsupported? Glad I don't have to decide.

Balrog's picture
Offline
Joined: 2009 Apr 24

There aren't any guidelines for warez yet, so I'm only removing what's obvious.

I was thinking of a system in which you upload a file to the server, then the server generates a .torrent that it tracks and seeds at a throttled rate. This way the file will never disappear [as is often the case with torrents], but it won't put a terrible strain on the server either. Also backups would be easy.
However, this would require more space on the server itself. The current server is only 10GB, and most of the files are on S3 storage, which can't be used with such a scheme.
I priced out dedicated servers with close to 1TB of storage, but it would cost close to $120 a month.
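The upload-then-generate-a-torrent idea is simple enough to sketch. The snippet below is only a minimal illustration, not an actual server module: it bencodes a single-file .torrent from an in-memory payload, and the tracker URL, filename and sizes are all made up.

```python
import hashlib

def bencode(obj):
    # Minimal bencoder covering only the types a .torrent needs.
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, str):
        return bencode(obj.encode())
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):
        items = sorted((k.encode() if isinstance(k, str) else k, v)
                       for k, v in obj.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError(type(obj))

def make_torrent(name, data, tracker, piece_len=262144):
    # Hash the payload in fixed-size pieces, as the metainfo format requires.
    pieces = b"".join(
        hashlib.sha1(data[i:i + piece_len]).digest()
        for i in range(0, len(data), piece_len)
    )
    meta = {
        "announce": tracker,
        "info": {
            "name": name,
            "length": len(data),
            "piece length": piece_len,
            "pieces": pieces,
        },
    }
    return bencode(meta)

# Hypothetical upload: 300 KB stand-in payload, invented tracker URL.
torrent = make_torrent("game.img", b"x" * 300000, "http://example.org/announce")
print(torrent[:20])  # metainfo starts with the bencoded announce URL
```

The server would write this blob into the download field and hand it to a throttled seeder; backups stay trivial because the original file is still sitting on disk.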

Hotline still requires servers, unfortunately Sad And Rapidshare / Mediafire / etc are not reliable at all.

Offline
Joined: 2009 Apr 18

Just to clarify myself and my thoughts: I'm in no way trying to move things away from this place/server, just trying to find ways to expand the possibilities for the community and the occasional visitor. The limits here are clearly stated, respected and understood, and so are the people de facto working with the site and server. Smile

bertyboy's picture
Offline
Joined: 2009 Jun 14

Sorry SwedeBear,

Not flaming you for the discussion, just (hopefully) adding more to it, even the drawbacks. We will find a way one day, someone will have an idea.

Offline
Joined: 2009 Apr 22

Why not use the 7zip format? It compresses files greatly. There's a MacOS X program (7zX), and we could make non-compressed disk images with Disk Utility or Disk Copy, then use 7zX to compress the disk images considerably. 7zX even supports Macintosh resource forks (a selectable option in the program settings). This could help make some games fit under the 400MB limit.

MacWise's picture
Offline
Joined: 2009 Apr 29

7zip compresses more than regular zip, but there's no Classic version. For emulator users this is no problem, but what about others?

Offline
Joined: 2009 Apr 22

but what about others?

For those who only have access to a classic Mac, for instance, it wouldn't matter much anyway. My 867MHz dual G4 can use 7z, but it compresses slowly. It wouldn't be feasible to decompress on an older system (pre-MacOS X). I have both, and could use my current computer to compress and decompress 7z files.

I would of course only recommend this for those big disk images that won't fit 400MB.

There is an alternative - but Maedi isn't too fond of it - using stuffit to segment archives. Stuffit 5.5 can segment and join files in specific sizes. Say a 700MB CD image - make 2 parts of 350MB each. You would need two listings in Macintosh Garden to have both parts for download - and you would need stuffit to re-join the files back into the original archive (containing the disk image, etc).
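StuffIt itself can't be scripted from here, but the segment-and-rejoin idea is generic. A sketch with an in-memory stand-in for the CD image and made-up segment sizes (scaled down so it runs quickly):

```python
def split_file(data, part_size):
    # Cut the archive into fixed-size segments; the last one may be shorter.
    return [data[i:i + part_size] for i in range(0, len(data), part_size)]

def join_parts(parts):
    # Concatenate the segments back into the original archive.
    return b"".join(parts)

image = bytes(range(256)) * 4000        # ~1 MB stand-in for a disc image
parts = split_file(image, 350 * 1024)   # e.g. 350 KB segments for the demo
restored = join_parts(parts)
assert restored == image                # rejoining is lossless
print(len(parts))
```

The catch is exactly the one discussed here: every part must survive, and each part needs its own download listing.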

The first option is much less technical and easier for users, but isolates those with access only to a classic Mac (although I don't think this is much of a problem - I could be wrong here).

The second option would work under classic systems (Stuffit 5.5 is supported on classic systems), but would require more work, and the administrator is not fond of having more than one page for one game.

Choices, choices... fun, aren't they?

--Rob

MacWise's picture
Offline
Joined: 2009 Apr 29

There is an alternative - but Maedi isn't too fond of it - using stuffit to segment archives.

The more I think about it, the more sense it makes to me to split CD images bigger than 400 MB, because even with 7-Zip some images won't fit anyway, and the multipart SITs can be stored on the Amazon server along with the standalone SITs.

Maedi might not be fond of the idea because in the early days of the Internet "big" files were distributed that way and it was troublesome. A 20 MB file would be split into 15 floppy-size segments. If one segment was missing or corrupted (a common scenario), you were out of luck. Today things are different: a 650 MB file only needs to be split in half. That's no big deal.

Offline
Joined: 2009 Apr 18

@ bertyboy: no harm done or taken. Wink

Actually, maybe we should focus on the fact that alternative compression algorithms don't make any impact on the number of titles, but on the total stress on the server and bandwidth, by squeezing one specific CD image under the MB limit. Setting focus on alternative storage with access from this site ought to be the main thing, IMHO. Smile

Balrog's picture
Offline
Joined: 2009 Apr 24

7zip is not that much better than .sit or .zip.

I'm sure using torrents [as detailed above] would be better, but we'd need some way to pay for the server space.

Balrog's picture
Offline
Joined: 2009 Apr 24

@MacWise: That's not the reason ... Maedi doesn't like that because of bandwidth and storage costs. Bittorrent moves the bandwidth costs to the downloaders, so that's why it's my preferred solution.

Offline
Joined: 2009 Apr 22

So from all of the discussion - here is a summary:
* Use BitTorrent via use of our own server
* Use alternate compression methods
* Use splitting / segmenting techniques

Each idea works from a technical standpoint. From a practical standpoint, I can see this website starting to get big and costly bandwidth-wise. It might possibly get so big that the site is closed due to bandwidth costs, or requires paid membership, or both.

The most cost-effective way for everyone would be to have some kind of BitTorrent server in place. But this in itself has challenges. How do you update a file - say someone has a newer version or a patch or something... There has to be some way of replacing an "out of date" torrent with an updated torrent. Could we just delete the old torrent from the server so that current seeds/leeches won't be able to finish the download, and have to return to the website to get the current version?

What about a "backup" copy, in case a torrent dies, so that it could be revived? Someone would need to keep backups of the games which are transferred via BitTorrent so that nothing is lost should all seeds leave, the file get corrupted at the one remaining seed, etc...

Oh - and for those of us uploading currently - how about a checksum file inside the archive? I think it would be a good thing to have - thoughts on this?
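A checksum manifest in the classic `md5sum` layout would be easy to generate before stuffing the archive. A sketch, with invented file names and in-memory data standing in for the real disc images:

```python
import hashlib

def checksum_line(name, data):
    # One "<md5>  <filename>" line, the same layout md5sum uses,
    # so standard tools can verify the archive after download.
    return "%s  %s" % (hashlib.md5(data).hexdigest(), name)

# Hypothetical two-disc game; the names and contents are made up.
manifest = "\n".join(
    checksum_line(name, data)
    for name, data in [("Disc1.img", b"disc one"), ("Disc2.img", b"disc two")]
)
print(manifest)
```

Dropping the resulting text file into the archive would let anyone confirm their copy survived the download intact.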

MacWise's picture
Offline
Joined: 2009 Apr 29

@Balrog: I see. In that case Bittorrent seems like the best alternative.

Offline
Joined: 2009 Apr 18

What about a "backup" copy - in-case a torrent dies - so that it could be revived. Someone would need to keep backups of the games which are transferred via BitTorrent so that it wouldn't be lost should all seeds leave, the file gets corrupted at the one seed, etc...

The thought of backups is important no matter which way we choose to distribute bigger files. Most of the community supports it by uploading unique files. If each one could publish a list of the files/disks/floppies they are willing to handle 'backups' for, the task could be spread out and hopefully easily taken over when someone decides to leave or step back. The alternative spells TBs, and someone(s) with the resources to regularly scan the files to discover any failures and put up the corresponding backups.
Thoughts?

Balrog's picture
Offline
Joined: 2009 Apr 24

The file would be uploaded to the server by regular means. Then, software on the server would create a torrent file for the original file, stick the torrent file in the download field, and seed the torrent at a low rate (15kbps or less). The original uploader will still want to seed because while such a slow rate is good enough for preservation purposes (wait long enough and you'll get it), most people don't want to wait. This would also make it easy to do a backup of all the files, probably using rsync.

I'd do this for files larger than 200MB.

Offline
Joined: 2009 Mar 21
There is an alternative - but Maedi isn't too fond of it - using stuffit to segment archives.

The more I think about it, the more sense it makes to me to split CD images bigger than 400 MB, because even with 7-Zip some images won't fit anyway, and the multipart SITs can be stored at the Amazon server along with the standalone SITs.

As Balrog said, it's about bandwidth. If someone goes to download part 1 and then part 2 of a 700MB game, it's still 700MB of bandwidth instead of the 'ideal' 400MB (400MB is too much, by the way).

I just checked our Amazon S3 usage and it's jumped to $79.10 USD from $18.23 USD last month. (451.481 GB transferred this month). The server also costs $40 USD a month. We've had around $200 USD in donations since the Mac Garden was re-launched, but as you can see, the monthly bill is unsustainable.

I think that for games that are big & popular, a torrent solution is ideal. Hosting torrents on Mac Garden will not fix the common problems associated with torrents. It could add much complication and there would be more things to go wrong.

Right now this website is simple and easy to maintain. Cost is our biggest worry; it's currently not sustainable. The second biggest worry is complicating the site further. All the options are still valid, but the simplest option is usually the most effective. For example, bringing the download limit down to 200MB and hosting every other file with BitTorrent would probably save a lot of money.

Balrog's picture
Offline
Joined: 2009 Apr 24

Why would hosting torrents on our server be a problem? The throughput rates would be throttled very low, and the copy stored on the server would be primarily for backup purposes. If necessary, we don't even need to seed from the server by default, but only if someone requests a reseed.
To prevent abuse we could have a system that requires a downloader to seed for a certain number of hours after the file finishes, and if he doesn't do this, he gets suspended (from downloading more) for a few days. There's one torrent site that does this (Maedi, please email for details) and I feel that it's fairer because people with fast connections don't get a big advantage.

I will see what I can do, but my major worry about an external torrent site / server is that files will get lost.

Also: some measures should be put in place to prevent people from wget'ing the whole site. I think this is happening a good bit. Sad

MacWise's picture
Offline
Joined: 2009 Apr 29

Balrog and Maedi,

You got my vote. I just wonder how many people will actually seed those files. I can't do it for long because my Windows XP laptop is shared, not to mention it has the tendency to overheat. The machine that I could use for that is the old Mac Box but there's no BitTorrent for Mac OS 9.

Offline
Joined: 2009 Mar 21

my major worry about an external torrent site / server is that files will get lost.

I think the issues you see with torrent sites would be:
Legality
Seed Leech ratio
Files disappearing

These issues will still exist if you move the torrents over to the Mac Garden server. I don't like the idea of putting all your eggs in one basket, having 30GB worth of games on one server. BitTorrent is associated with illegal activity, and MacGarden already walks a fine line with copyright infringement... I host other sites on this server and I don't want them to be caught up in a potential legal issue down the track.

Hosting torrents on Garden, the issues would be:
Legality
Seed Leech ratio
Files disappearing & therefore self-seeding and hosting files on our server

Currently there is not enough space on our server to host large files. And as I said, keeping files decentralised is a smart move for the long term. However, I can't afford S3's current monthly bill, and if storing files on the one server solves that, great! But it does get messy - which files go where? We've already set S3 up. A download cap could solve our problems instantly...

MacWise's picture
Offline
Joined: 2009 Apr 29

A download cap could solve our problems instantly...

I was going to suggest that but I held my tongue. What kind of cap are you thinking of? Low bitrate or download limit per month?

bertyboy's picture
Offline
Joined: 2009 Jun 14

187GB bandwidth in the last month? Just for the Amazon-hosted stuff?

At most I'll use perhaps 2GB, uploading a few 400MB images.
That means there's a lot of people doing a lot of downloading.

I was going to download, repack and upload the Civ II stuff (since I trashed the Toast images), but I'll re-cut them from the original media, now back in the dark corners of my bookcases, somewhere.

Balrog's picture
Offline
Joined: 2009 Apr 24

BitTorrent is used for legal purposes too. Some Linux distributions use it for updates, many games (World of Warcraft [iirc], certainly others) use it for updates, and (in my opinion) it's certainly better from a legal standpoint because we're only tracking a file, not uploading it. If we upload, then it would be about the same as what we have now. As long as we respect takedown notices and don't do what The Pirate Bay did (they would make arrogant replies to such letters and post them immediately), I don't see why a problem would exist.
Some countries have been trying to completely block the protocol, but the legal uses get hampered when this happens.

If we did use torrents, I'd institute a system in which people have to seed for a certain number of hours after downloading or be banned from downloads for a few days. I think this is better than the share ratio system.

Also, registrations would have to be more tightly enforced ... Maybe require invites and allow people to request one via the web or via IRC (this is for torrents only, regular downloads would be accessible by any registered user).

I'll look into the Drupal module.

MacWise's picture
Offline
Joined: 2009 Apr 29

Hey Maedi, if you're still thinking of a download cap, I ran a test with ViaHTTP and successfully stopped and resumed a download several times. The resulting file decompressed and ran with no problem (it was Blackthorne). So if members are limited to, let's say, 50 MB per week or 200 MB per month, they'll still be able to download big files as long as they accept the cap. If they don't, then they don't. Nobody is forcing them to download all this great abandonware stuff. This also solves the problem with the upload limit: it's no longer necessary, because no matter how big the file (or the files, in the case of multi-disc games), the user will always download no more than the allotted amount of megabytes per month. Later on, as the site's popularity winds down, the cap can be relaxed or even removed completely.
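Server-side, a per-member monthly allowance like this is simple bookkeeping. A sketch with made-up member names and limits, not how the actual site would have to implement it:

```python
import datetime

class DownloadCap:
    # Per-member monthly allowance; the 200 MB figure is just an example.
    def __init__(self, limit_mb=200):
        self.limit = limit_mb * 1024 * 1024
        self.used = {}  # (member, year, month) -> bytes downloaded

    def allow(self, member, size, when=None):
        # Record the download only if it fits within this month's allowance.
        when = when or datetime.date.today()
        key = (member, when.year, when.month)
        if self.used.get(key, 0) + size > self.limit:
            return False
        self.used[key] = self.used.get(key, 0) + size
        return True

cap = DownloadCap(limit_mb=200)
print(cap.allow("swedebear", 150 * 1024 * 1024))  # within the allowance
print(cap.allow("swedebear", 100 * 1024 * 1024))  # would exceed 200 MB total
```

Resumable downloads (as in the ViaHTTP test) fit naturally: each served byte range just counts against the same monthly total.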

And Balrog, I know you like BitTorrent. When you explained to me why Maedi didn't want 400 MB+ files, I agreed with you that BitTorrent seemed like the best alternative. But after reading things like "Hosting torrents on Mac Garden will not fix the common problems associated with torrents" and "Some countries have been trying to completely block the protocol", I'm having second thoughts. And I still have doubts about seeders, even if you force them. Remember what happened to LimeWire? It could be the same thing here. But if you find a way to make it work, then more power to you, man. I'm just putting in my two cents, not trying to oppose you.

IIGS_User's picture
Offline
Joined: 2009 Apr 8

Such a method sounds good; are there versions of ViaHTTP for recent operating systems?

Anonymous

I use a cell phone for my modem with a 2-hour battery life, so if you did use torrents I could only seed for 1 hr. Also, please don't have download caps like vetusware.com. In case you haven't been to Vetusware's web site, here's what they do: they only let you download one file a day unless you upload files to their web site. This really stinks for the Mac crowd who use it to get OSes for the emulators.
That's my two cents

Balrog's picture
Offline
Joined: 2009 Apr 24

About what some countries are doing: usually there's a way around this. Having SSL encryption on the tracker (self-signed, no need to pay for the cert) combined with BitTorrent encryption solves most of these problems. About the lack of seeders, throttled web seeding would work, but the Drupal module for that is still very 'alpha' ... I could probably figure it out anyway.

I don't think a download cap is possible with S3 ... Someone can go to the S3 root and parse the XML file anyway. Also download caps are VERY annoying.

About seeding requirements: none of that is final, and changes can be made at any time.

Attila's picture
Offline
Joined: 2009 Apr 22

Macs are easy to use. Web browsers are easy to use. Torrents not so much. I use public access computers for internet, and torrents and file sharing will never be available to me because software cannot be installed on these public access computers. I'm not saying you shouldn't do something in that area; I'm just saying be aware of its limitations and that it will inevitably shut some people out. Not to mention it's so much easier for the average person to find something like the Macintosh Garden on the web than it is to find some obscure file sharing system.

Balrog's picture
Offline
Joined: 2009 Apr 24

@Attila: they even block portable apps that run from flash drives? There is a portable uTorrent, FYI.

In any case, only larger files would be available this way; for smaller ones the savings are small to none.

Attila's picture
Offline
Joined: 2009 Apr 22

I've tried everything for torrents and such. Flash drives don't help. I always get messages telling me I don't have network administrator's privileges and can't get any further. Even if the software's already installed and I'm just trying to run it. I'm probably better off; would just waste a lot of my time going through that endless supply of stuff out there. I don't really have time for the Mac Garden as it is!

MacWise's picture
Offline
Joined: 2009 Apr 29

are there versions of ViaHTTP for recent Operating systems?

Both Safari and FireFox have built-in download managers, and even though they're simple, they're free and you've already got them. If you want something better, there's a FireFox add-on called DownThemAll that's all the rage right now, so check it out. There are also stand-alone programs like Speed Download and iGetter, but they're shareware.

I've tried everything for torrents and such. Flash drives don't help. [...] can't get any further. Even if the software's already installed and I'm just trying to run it.

Now that you mention it I remember something about BitTorrent being blocked on public terminals by nasty administrators, but I don't know why. Maybe it's because BitTorrent is associated with illegal activities. Anyway if we stick to browser-based downloads there could be a solution for your dilemma. Wanna give it a try? Get Free Download Manager and once you have it running, select from the File menu Create Portable Version and install it on your flash drive. You should be able to use it on any public terminal.

i use a cell phone for my modem with a 2 hour battery life, so of you did use torrents i could only seed for 1 hr.

Which is precisely my point: BitTorrent needs seeders. It needs plenty of them and in a constant fashion. If we're all going to seed every now and then for short periods of time, it's not going to work.

Download caps, as annoying as they are, work because they're not optional. They keep free servers free.

Balrog's picture
Offline
Joined: 2009 Apr 24

Let me repeat: I don't think a download cap is possible with S3. Someone can always go to the S3 root, parse the XML, and download all the files at once.

Since the files are on Amazon's servers and not ours we can't really do download caps. (Maedi, correct me if I'm wrong.)

I'm not sure what @Attila was trying but there's almost always a way around such blocks.
And how are you going to enforce a download cap? By account? That would lead to proliferation of accounts, which is very bad for us. By IP? Most people have dynamic IP anyway.

Throttling would be easier, but by itself it won't help much, and it would be terribly inconvenient. We could probably have both BitTorrent and throttled (~30kbps maybe) HTTP available for the same files, though ... This wouldn't increase complexity much over a simple BitTorrent-based setup.

This all assumes that we have full control over our files. As long as they are on S3, none of this can be done.
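Balrog's point that an S3-side cap is defeatable is easy to demonstrate: anyone can GET the bucket root and parse the listing. The sketch below works against a trimmed-down sample of S3's ListBucketResult XML using only the standard library (the bucket name and keys are invented):

```python
import xml.etree.ElementTree as ET

# A trimmed-down sample of what a GET on an S3 bucket root returns.
listing = """<?xml version="1.0"?>
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>macgarden</Name>
  <Contents><Key>games/big_game.sit</Key><Size>734003200</Size></Contents>
  <Contents><Key>games/small_game.sit</Key><Size>1048576</Size></Contents>
</ListBucketResult>"""

# S3 responses are namespaced, so register the namespace for lookups.
ns = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}
root = ET.fromstring(listing)
keys = [c.find("s3:Key", ns).text for c in root.findall("s3:Contents", ns)]
print(keys)
```

With every key in hand, a scraper can fetch the files directly and bypass whatever accounting the site front-end does, which is why a cap only works when downloads are forced through our own server.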

MacWise's picture
Offline
Joined: 2009 Apr 29

This all assumes that we have full control over our files. As long as they are on S3, none of this can be done.

S3 supports BitTorrent, so there's no need to host files elsewhere. The S3 FAQ says, "Simply add the ?torrent parameter at the end of your GET request in the REST API." But I'm going to say it for the last time with all the good intentions in the world: I don't think it's going to work, not for some technical reason but because people are not going to collaborate. I could be wrong, and if BitTorrent happens, I hope I'm wrong. The last thing I want is Macintosh Garden going down again.

While the S3 FAQ mentions nothing about supporting caps, it says, "(S3) is designed to be highly flexible [...] (developers can build) a sophisticated web application such as the Amazon.com retail web site." So a cap is possible and shouldn't be discarded without further examination. I believe this is a better solution to the leeching problem because it requires no collaboration from the user, who's causing the problem in the first place. If anyone else has any other idea that doesn't rely on user collaboration then lets hear it.

Balrog's picture
Offline
Joined: 2009 Apr 24

I'd probably go with a combined method: caps for direct HTTP downloads, and no caps for torrents. I'm currently testing S3's torrent system, but I'm not sure.

In any case, when using S3/Bittorrent, the file still resides on the S3 server IN FULL and user collaboration isn't 100% necessary.

MacWise's picture
Offline
Joined: 2009 Apr 29

In any case, when using S3/Bittorrent, the file still resides on the S3 server IN FULL and user collaboration isn't 100% necessary.

So the file isn't lost when the cluster dies. That's great. And you agree user collaboration is necessary, though not 100%. OK. I respect your opinion. What's more important, we're making progress. Big smile Cool

Balrog's picture
Offline
Joined: 2009 Apr 24

At no point did I try to suggest that the file would get lost. In fact, I said it would be necessary to have the web server seed to prevent that from happening!
It's good that we're on the same page now. Maybe if more people showed up on IRC (and stuck around) there would be less confusion in general ... I'm there much of the time Wink

MacWise's picture
Offline
Joined: 2009 Apr 29

The doctor told me to stay away from IRC. If I ever get in touch with that thing again I may chat my life away! Laughing out loud

Offline
Joined: 2009 Mar 21

I've tried everything for torrents and such. Flash drives don't help. I always get messages telling me I don't have network administrator's privileges and can't get any further. Even if the software's already installed and I'm just trying to run it. I'm probably better off; would just waste a lot of my time going through that endless supply of stuff out there. I don't really have time for the Mac Garden as it is!

I think for games over 200MB we'd use torrents, so downloading games off the site would still be very intuitive most of the time.

Download caps, as annoying as they are, work because they're not optional. They keep free servers free.

Still an option; depends how successful BitTorrent turns out to be.

In any case, when using S3/Bittorrent, the file still resides on the S3 server IN FULL and user collaboration isn't 100% necessary.

So the file isn't lost when the cluster dies.

What does happen with the "?torrent" method, is the file 'disabled' by S3 when there are no more seeders?

Balrog's picture
Offline
Joined: 2009 Apr 24

With the ?torrent method, S3 seeds if there are no seeders ... which costs us the same as if the downloader used HTTP to download the file. Download caps would be very hard to implement with S3 ... they would probably require custom PHP coding, and the S3 permissions system isn't flexible enough to have both caps and BitTorrent.