Adrian Midgley (Gmail) wrote:

> Writing code to allow a generic network facing interface to receive
> questions from an authorised other node on the network and pass them to
> a system-specific layer which turned those questions into actions on the
> general practice* database is a simple scalable task which has been done
> for many systems - indeed the stuff I'm using now has a Windows frame
> around a terminal emulator that pretends to the database system that it
> is a Wyse green screen terminal.

I'd say not simple. You have N back ends, and you need to keep that
interface working whenever any of those back ends is upgraded or
replaced, which creates a maintenance nightmare (the first sketch at the
end of this message shows the kind of per-system layer involved). Better
to define a new system that others export data to, or migrate to.

The approach outlined would result in a piecemeal system that wouldn't
be useful for something like patient data, as it would always be bailing
out to the lowest common feature set - "read text or image for this
patient".

> We have the example of the Internet before us, we know how these large
> tasks should be attempted.

That depends what criteria you apply. The really large systems on the
Internet that handle these kinds of user volumes are often proprietary
(think GMail) or inherently quite simple (DNS).

The WWW, for example, defines a simple query/answer protocol for moving
data with optional security (HTTP), but even for the relatively common
case of delivering a PDF file, about 50% of websites can't do it in a
standards-conforming manner (declaring their response to be a PDF
document), so my client bails out to "save binary data to disk" (the
second sketch at the end of this message illustrates that fallback). The
common client implementations don't provide a "log out" feature for HTTP
Authentication. Much is broken here, both in standards and in
implementation.

> The SCR is not a tool I expect to find useful even when it works, and I
> do not think we have the capacity to make its administration honest.

But if you provide the functionality via different back-end systems, you
still have the same basic problems to solve; you just have to solve them
repeatedly rather than once.

I'd liken it to those people who plan to solve the spam problem by
rewriting SMTP. There is a lot wrong with SMTP, and rewriting it can
produce protocols that are harder to abuse. But until you place some
sort of barrier between a new, unknown sender and your eyeball, the spam
problem is not solved, because it is inherent to any messaging system
that accepts messages from senders of unknown origin, and re-jigging the
problem space slightly doesn't address that fundamental issue.

So the "someone unauthorised might see my medical records" problem
already exists. Its scope is limited because the people who can see my
medical records are largely restricted to those who know how to access
your computer system. No matter how you remove that restriction, it
remains the case that when more people can access my medical record,
there is more scope for that access to be malicious.

One might argue there are only differences of degree in the risks,
simply because of the numbers. But ultimately I suspect you can't get
the advantages without those risks.

Interestingly, whilst some people have expressed concerns over the
Inland Revenue and its lost records, where the government has simply
gone ahead with these things it is largely accepted as a fact of modern
life; people enjoy throwing the odd stone when data is lost, but
probably wouldn't want to pay more tax for a less efficient tax system.
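
To make the first point concrete, here is a rough sketch in Python of
the layering being described - a generic question/answer interface in
front of a system-specific adapter per back end. The names
(PatientQuestion, SystemAAdapter and so on) are invented for
illustration and are not anyone's real code.

    class PatientQuestion:
        """A generic question arriving over the network-facing interface."""
        def __init__(self, nhs_number, question):
            self.nhs_number = nhs_number
            self.question = question          # e.g. "current medications"

    class BackEndAdapter:
        """One of these has to exist, and be kept working, for every
        back-end GP system."""
        def answer(self, q):
            raise NotImplementedError

    class SystemAAdapter(BackEndAdapter):
        def answer(self, q):
            # In reality this would drive System A's database, or pretend
            # to be its green-screen terminal; here it just returns a stub.
            return "System A answer for %s: %s" % (q.nhs_number, q.question)

    class SystemBAdapter(BackEndAdapter):
        def answer(self, q):
            # Same question, but a completely different translation job.
            return "System B answer for %s: %s" % (q.nhs_number, q.question)

N back ends means N adapters; upgrade or replace any one back end and
its adapter has to be reworked, which is the maintenance problem
described above.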
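
And for the HTTP/PDF point, a rough illustration (again Python, and not
my actual client's code) of the "save binary data to disk" fallback:
only treat the response as a PDF if the server actually declares it as
one.

    import urllib.request

    def fetch(url):
        """Fetch a URL and decide how to handle the body."""
        with urllib.request.urlopen(url) as resp:
            # The media type the server *claims* to be sending.
            ctype = resp.headers.get("Content-Type", "")
            ctype = ctype.split(";")[0].strip().lower()
            data = resp.read()
        if ctype == "application/pdf":
            return "render as PDF", data
        # Many servers send text/plain, application/octet-stream or even
        # text/html for a .pdf URL, so all the client can safely do is
        # offer to save the bytes and let the user sort it out.
        return "save binary data to disk", data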
--
The Mailing List for the Devon & Cornwall LUG
http://mailman.dclug.org.uk/listinfo/list
FAQ: http://www.dcglug.org.uk/linux_adm/list-faq.html