I am developing a small application to track contacts with a group of project developers (approx 250), including tombstone data and records of telephone calls and correspondence, as well as their staff members (~350); the status of the projects themselves (~150 submissions to date); and the status of the contracts arising out of successful proposals (~50). There are also reminder items, insurance-policy tables, and so on. We have 6 users who will likely have the database open but inactive through most of the day.
I have split the database into a front end and a back end. Until today I have been distributing the front end by e-mail and keeping the back end on a network share available to all users, and this has worked so far. The back end has been replicated, and the users' front ends are linked to the replica. If continued development requires a change in the table layouts or new tables, I make the changes in the design master, synchronize it to the replica on the network drive, and distribute a new front end if required.
I do most of my development work on a front end linked to a copy of the design master that I have placed on my local drive. Today, when I linked to the network file, I noticed a dramatic difference in performance between the local and network back ends. It was horrible, and I think the users are going to feel the same way. If they don't like it they won't use it, and that defeats the whole point.
I am considering replicating a local back end onto each user's hard drive and including an OnTimer procedure to check the time every 300 seconds or so; if it has been more than 15 minutes since the last synchronization with the network back end, the procedure would force a synchronization through JRO. Does anyone have any experience with that sort of setup? Good, Bad, or Ugly?
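In case it makes the idea more concrete, here is a rough sketch of what I have in mind for the timer procedure. The paths, the module-level variable, and the form itself are placeholders; it assumes a reference to "Microsoft Jet and Replication Objects" is set in the VBA project:

```vba
' In the module of a form that stays open (e.g. a hidden switchboard form).

Private m_LastSync As Date   ' when we last synchronized with the network back end

Private Sub Form_Load()
    Me.TimerInterval = 300000          ' fire Form_Timer every 300 seconds
    m_LastSync = Now                   ' assume we start out freshly synchronized
End Sub

Private Sub Form_Timer()
    ' Only force a synchronization if more than 15 minutes have elapsed
    If DateDiff("n", m_LastSync, Now) >= 15 Then
        Dim rep As New JRO.Replica
        ' Placeholder paths for the local replica and the network replica
        rep.ActiveConnection = "C:\Data\Contacts_be.mdb"
        rep.Synchronize "\\Server\Share\Contacts_be.mdb", _
                        jrSyncTypeImpexp, jrSyncModeDirect
        Set rep = Nothing
        m_LastSync = Now
    End If
End Sub
```

The two-way Synchronize call would push local changes up and pull network changes down in one pass; I have not yet decided how to handle any conflicts it reports.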
Any enlightenment appreciated.