
in ICT Services by Guru (88.1k points)
We are just about to build a private 10G fibre network across 3 continents. The major part of the SLA is delivering and syncing about 4 petabytes of data between 2 data centres located on opposite sides of the ocean. This is for backup purposes, just in case a bomb or an aeroplane lands on one data centre.

We have been in the market looking for WAN acceleration, and Aspera ( http://asperasoft.com/ ) is the cheapest we can find, at $400 000.00 per node licence!

Is there any software or open-source alternative out there that you know of?


1 Answer

+1 vote
by Expert (17.4k points)
Best answer
What are you trying to accelerate on the WAN? I am pretty sure you can run vSphere® with Operations Management™ and have a solid solution without losing an arm and a leg. If you want to accelerate apps on the WAN then you could check Riverbed Technologies. For a global insurance firm we used the Steelhead EX Series and it integrates very well with VMware SDDC.
by Guru (88.1k points)
We want to build a dark fibre 10G network which can easily be upgraded to 40G by just changing the GBICs.

It's a simple file transfer; I will maybe just use plain rsync.
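
Something along these lines, assuming SSH access between the two sites (the host names and paths here are just placeholders):

    # -a preserves permissions/ownership/times, -v is verbose,
    # --partial keeps interrupted files so a rerun can resume them
    rsync -av --partial --progress /data/staging/ backup-dc:/data/staging/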

But here is the tricky bit. An exploration boat might dock at one site in Europe. It might have maybe 1 petabyte of data on magnetic tapes, which would then be delivered to us.
We will copy all that data to disk and QC it. Once we are happy, we then have to transfer all of it to the client over the private WAN.

That is where WAN acceleration kicks in. It has to be done as quickly and as efficiently as possible. We want a solution which takes away the TCP overhead.
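
For context, the usual baseline before any WAN accelerator is just raising the kernel's TCP buffer limits so the window can actually fill a long fat pipe. A sketch, assuming Linux hosts (the values are illustrative only, not sized for our link):

    # Allow large socket buffers so the TCP window can cover a
    # high bandwidth-delay product path (illustrative values)
    sysctl -w net.core.rmem_max=134217728
    sysctl -w net.core.wmem_max=134217728
    # min / default / max auto-tuning bounds for TCP buffers
    sysctl -w net.ipv4.tcp_rmem="4096 87380 134217728"
    sysctl -w net.ipv4.tcp_wmem="4096 65536 134217728"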
by Expert (17.4k points)
If your networking equipment supports jumbo frames then you can cut down the per-packet TCP/IP overhead and improve efficiency and effectiveness. IMHO, for wide-area and stressed networks rsync can be a challenge because it provides less resilience. Depending on how well planned and thought out the setup is, I guess this could be a good idea, but the agility therein is somewhat limited.
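
To be concrete, enabling jumbo frames is typically a one-liner per interface, assuming a Linux host, an interface named eth0 (a placeholder), and that every hop on the path handles 9000-byte frames:

    # Raise the MTU to 9000; every switch/router on the path must
    # support it end to end or frames get dropped/fragmented
    ip link set dev eth0 mtu 9000
    # Confirm the change
    ip link show dev eth0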
by Guru (57.6k points)
rsync was designed for file transfers over LAN/WAN; it's designed to be mostly resilient...
by Expert (17.4k points)
Anthony, from enterprise experience working with rsync, it has extremely limited resilience when working with huge data.

There is always an issue of instability when running continuous work on a huge volume of data. There are no universal recipes for all servers and all data types, and I know rsyncing about 4 petabytes is nowhere near a wise thing to do with standard rsync.

As the data grows, the absence of trustworthiness (dependability, security, performability) and tolerance (survivability, disruption tolerance and traffic tolerance) becomes clearly visible. rsync is generally good for hobby use, but when it comes to enterprise computing you need more than just standard rsync. I stand to be corrected, and I am yet to hear of rsync successfully running at least a 1 PB workload without issues.
by Guru (88.1k points)
I will have to test and get the results across the Atlantic. Internally, I have used it to reliably transfer over 100 terabytes of data at times. It is one of my trusted tools nowadays.
by Expert (17.4k points)
When dealing with huge data, rsync craps its pants after a couple of hours and things don't seem to add up. Try it, and who knows, maybe you might pull an elephant out of the hat.
by Guru (88.1k points)
When you say big data, what are we really talking about?

I have run it for more than a week, transferring over 70 TB of data without a drop or a restart.

I'm interested to know where it let you down.
by Guru (57.6k points)
Yeah, I deal with servers daily and use rsync almost religiously for file sync. Its reliability comes down to 3 things: CPU, disk performance/capacity, and the reliability of your link between the servers. You get around the last one with options like --partial, and the beauty of rsync is that it's designed to repeat and resume in such scenarios without issues.
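
For example, a resumable invocation wrapped in a simple retry loop (hosts and paths are placeholders):

    # --partial-dir stashes partially transferred files out of the way;
    # --timeout aborts on a stalled link so the loop can retry
    until rsync -av --partial-dir=.rsync-partial --timeout=300 \
            src-host:/data/ /data/; do
        sleep 60   # back off before retrying
    done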

I guess it does depend on your dataset. Macdonald, you mention 4PB of data but you don't give anything like the average file size or number of files. rsync does struggle speed-wise with millions of tiny files (but any network transfer tool does, really).
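
If it does turn out to be millions of small files, the usual workaround is to split the tree and run several rsyncs side by side. A sketch, assuming GNU xargs, top-level directory names without whitespace, and placeholder hosts/paths:

    # Run up to 8 rsyncs in parallel, one per top-level directory
    ls /data/staging | xargs -P8 -I{} \
        rsync -a --partial /data/staging/{}/ backup-dc:/data/staging/{}/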