I have just started a tech startup with a group of my friends, doing virtual reality for architecture.
We all have our own beefy computers and have just bought a compact mini-ITX portable computer suitable for VR. However, this computer is so small we could only fit two SSDs in it. We would like it to be a machine that could render and create content as well, but the limited space doesn't allow us to use it for that purpose.
Also, as we don't yet have an office space, we are currently working from different locations within the same city, which makes working on the same projects difficult, as we all have our textures and renders saved locally to our own computers.
We all have fibre connections at about 48Mbps. I would like to set up a NAS or something similar that could be accessed and worked from remotely, where we could save all our textures and renders in one place.
I know very little about networking, but I'm very keen on any input from the community to help overcome these problems.
Not knowing very much about networking can make solving an issue like this problematic, as there are numerous solutions to the problem, some of which will work better than others for you and your team.
Personally, I would set it up by creating a server on your domain with a proper network share folder. All of your remote client machines would then connect to the domain server through a VPN tunnel and save their work on the network share as if they were working locally. You can then back up the shared folder as you normally would. As an added bonus, you could set that server up as a source control box and use it in the same manner.
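To illustrate the "as if they were working locally" part: once the VPN tunnel is up and the share is mounted, saving to it looks just like saving to any other folder. A minimal sketch, assuming a hypothetical mount point and project layout:

```python
import shutil
from pathlib import Path

# Hypothetical mount point of the network share (e.g. Z: on Windows,
# /mnt/studio on Linux/macOS) -- adjust to wherever your VPN share lands.
SHARE = Path("Z:/projects")

def publish_render(local_file: str, project: str) -> Path:
    """Copy a finished render from the local disk onto the shared folder."""
    dest_dir = SHARE / project / "renders"
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / Path(local_file).name
    shutil.copy2(local_file, dest)  # copy2 preserves file timestamps
    return dest

# Example usage with made-up paths:
publish_render("C:/work/lobby_final.png", "office-tower")
```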
Since that method requires a fair amount of networking knowledge and setup/maintenance, it may not be the best method for you. You might be more interested in a hosted service like Dropbox or Google Drive, but it would require constantly uploading/downloading all of the necessary files.
If you have the ability to set up an FTP server, you could solve this problem that way, too. Simply FTP the contents of your directory to the server when you're done working. Make the FTP server publicly available, or assign credentials to those you wish to have access.
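As a rough sketch of that "upload when you're done" step, here is one way to do it with Python's built-in ftplib (the host, credentials, and paths are all placeholders):

```python
import os
from ftplib import FTP, error_perm

def upload_tree(ftp: FTP, local_dir: str, remote_dir: str) -> None:
    """Recursively mirror a local directory onto the FTP server."""
    try:
        ftp.mkd(remote_dir)   # create the remote folder if missing
    except error_perm:
        pass                  # folder already exists
    for name in os.listdir(local_dir):
        local_path = os.path.join(local_dir, name)
        remote_path = f"{remote_dir}/{name}"
        if os.path.isdir(local_path):
            upload_tree(ftp, local_path, remote_path)
        else:
            with open(local_path, "rb") as f:
                ftp.storbinary(f"STOR {remote_path}", f)

# Placeholder host and credentials -- substitute your own.
# Assumes the parent folder (/projects) already exists on the server.
with FTP("ftp.example.com") as ftp:
    ftp.login(user="studio", passwd="secret")
    upload_tree(ftp, "C:/work/office-tower", "/projects/office-tower")
```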
Would I be able to work from an FTP server on a 48Mbps download connection? The upload speed is a lot less than that, so what I wanted to do was keep the project's textures, assets, etc. on the server, render locally on our computers, and back the renders up later, so that render times aren't bottlenecked by upload speeds of about 15-20Mbps. But my concern was having to pull textures and so on from an external source in real time. Would that work?
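To put rough numbers on my concern, here's the back-of-the-envelope maths (the 5GB texture set is just an example size):

```python
# Back-of-the-envelope transfer times; sizes are hypothetical examples.
texture_set_gb = 5        # example size of one project's texture set
download_mbps = 48        # our fibre download speed
upload_mbps = 17.5        # midpoint of our 15-20Mbps upload speed

def transfer_minutes(size_gb: float, mbps: float) -> float:
    bits = size_gb * 8 * 1000**3          # GB -> bits (decimal units)
    return bits / (mbps * 1000**2) / 60   # seconds -> minutes

print(f"pull textures: {transfer_minutes(texture_set_gb, download_mbps):.0f} min")
print(f"push renders:  {transfer_minutes(texture_set_gb, upload_mbps):.0f} min")
# pull textures: 14 min / push renders: 38 min -- workable as a batch sync,
# but far too slow to stream textures on demand while rendering.
```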
Another idea I had was to get everyone a RAIDed portable USB 3 hard drive formatted with the same drive name, and have the whole drive sync to Google Drive. That way the drives would still be separate from the computers, but the data would still be stored locally for speed. However, this method is very price/performance ineffective: if I were to buy four mirrored 8TB drives, we would have bought 32TB altogether, but the mirroring would bring that down to 16TB, and Google Drive syncing all the drives with the same data would bring that down to 4TB of usable space as a collective.
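Spelling out that arithmetic:

```python
# Usable space if four mirrored drives all sync the same data via Google Drive.
people = 4
raw_tb_each = 8                         # raw capacity per portable RAID enclosure
raw_total = people * raw_tb_each        # 32 TB bought in total
after_mirror = raw_total // 2           # 16 TB after RAID 1 mirroring
usable = after_mirror // people         # 4 TB once every drive holds the same data
print(raw_total, after_mirror, usable)  # 32 16 4
```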
So I guess my biggest question is: what kind of limitations would I see from working off an FTP server?
Hey there… Mega.nz (50GB free) or Dropbox is a good solution for general file sharing. There are also the usual suspects like Google Drive and Microsoft OneDrive.
That should help kick off file sharing, using whichever service your team likes the most. But down the line it looks like you'll need a proper source control system with large-file support, like GitHub LFS or Perforce (which people here say is quite suitable for game development).
You should ~never~ render off a public WAN cloud server, IMO… That's usually best left to a private cloud, and even then only within a well-defined workload/data-wrangling setup. I would not recommend FTP for a source-controlled 3D project workflow.
I think the first step is to use source control and render/develop locally, then push/pull from the server.
So to summarise, I propose:
Begin by exploring regular cloud sync and sharing like Dropbox or Google Drive.
Download and use the Perforce trial to get a feel for source control for your team, including with large files.
Move to a suitable system, either hosted in the cloud (public WAN) or on your own NAS (network-attached storage) server (private cloud), which everyone will use.
How much storage are you looking at? I would recommend getting a NAS device and a separate rendering device. The NAS will better handle the storage side of things (drive speed, redundancy, plugins, etc.), and the rendering device would be independent of the storage; you don't want one to go down at the cost of the other. I use a Thecus N5550, and while I currently run 5x 3TB Western Digital Reds, you could probably use 6TB drives, which would give you somewhere around 22-24TB of usable space. You'll also need to consider some sort of backup solution, so either pick up a second unit and keep it offsite, or look into cloud storage solutions.
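For reference, that usable figure is just RAID 5 arithmetic (one drive's worth of space goes to parity); the RAID 5 assumption is mine, and real-world numbers come out a bit lower after filesystem overhead:

```python
# RAID 5 usable capacity: raw space minus one drive's worth of parity.
drives = 5          # bays in the NAS
drive_tb = 6        # hypothetical 6 TB drives
raw_tb = drives * drive_tb            # 30 TB raw
usable_tb = (drives - 1) * drive_tb   # 24 TB before filesystem overhead
print(raw_tb, usable_tb)              # 30 24 -> roughly the 22-24 TB quoted
```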
This will allow you to set up your renderer and scale out appropriately if you need to turn it into a render farm, without tying it directly to your storage/backup needs.