In case you're interested in the "dark" side of R6, here's a copy of an email
I received a few days ago... from a rather "engaged" Notes fan, split across 2
postings because of the YaBB forum's length limit:
OpenNTF.org
http://www.OpenNTF.org

The Secret Ingredient - Volume 1, No. 3
The future of Domino and client side programming
by Nathan Freeman (mailto:Nathan.T.Freeman@OpenNTF.org)
I love the Notes/Domino platform. I just want to be clear on that, because
sometimes I catch heat about being too hard on Lotus. It's a tough love, I
suppose. I can't really hug a division of IBM. I can give them an
occasional pat on the back, but it's just not the same. Like a parent
scolding her child in the grocery store, observers only see the negative
feedback, and miss out on the smiles and laughter on all the other aisles.
So let me be very clear: I love Notes -- and I owe its creators more than
I'll ever be able to repay.
Now that that's out of the way, on to the spanking. (Insert Monty Python
reference here.)
I'm really worried that IBM's missing the boat. I've been publicly
critical about the whole Garnet/WebSphere thing for quite some time, so my
concerns in this arena are pretty widely known. Indeed, addressing those
concerns was a major motive in the start of this site. But even if JSP and
servlet support were delivered in the most beautifully engineered and
integrated package that any of us could hope for, and it was free, and had
an instant learning curve, and could run efficiently on a 286, I'd still
feel like IBM was missing the boat. They would just have a really kick-ass
canoe.
The problem with the entire J2EE initiative -- and to be inclusive, .NET,
too -- is that it focuses on the wrong problem. It focuses on building
bigger and better web services farms, componentizing the hell out of
everything (a programming model with a long and dubious history, we should
note) and providing beautiful diagrams of the separation of presentation
from logic from data. By trumpeting standards, it chases the integration
brass ring. We all hear about best-of-breed tools and "n-tier
architecture." CIOs look at pretty graphs and rearrange entire departments
in response. Developers claw each other's eyes out in ASP vs. JSP and Java
vs. C# throw-downs.
I say it's all trivial.
The real sea change isn't to be found in the back office. We've been
building systems to integrate disparate data sources for web delivery for
years. Getting real-time inventory data is not a technical challenge --
it's a business process challenge. Building the data link is a no-brainer
compared to the effort of keeping it up to date, and there's nothing about
J2EE or .NET that makes it easier to convince a warehouse manager to take
daily inventories, except maybe cheaper bar-scan devices. Bringing out new
technology to cut $3 million projects to $1.5 million projects is not a
revolution. Cutting costs and cycles by 50% is good and important and
ultimately... boring.
The revolution is in the bandwidth. More specifically, it's in finding ways
to more effectively produce a positive and consistent user experience
without consuming absurd amounts of bandwidth. And it's here that the
latest and greatest web services strategies from the leading enterprise
software vendors provide, well, nothing. In the end, they generate HTML and
send it over the wire.
They're widely compatible, marginally innovative. Big whoop.
I want to see the new technology that changes the user's experience. Not by
providing them with real-time inventory data, but by being 5 times faster,
10 times more responsive and available from anything from a web-enabled
watch to a Beowulf cluster on an OC-192. And at this point, that can only
come from one thing: a smarter client.
Browsers suck. Anyone who's ever built a web site can tell you that. We
accept them, because they're everywhere and they do a lot of useful things
and they don't really require much training and everyone's grandma
understands anchor links and input boxes at this point. But they still
suck. They have poor interoperability, they're riddled with bugs, they're
hard to upgrade, they have lousy security models, and worst of all, they're
really dumb.
Being stuck with dumb browsers leads to two basic approaches to complex
user experiences on web sites. Developers either 1) write huge amounts of
version-branching DHTML code that's drenched in string parsing and TABLE
tags; or 2) have small incremental screens that move data back and forth to
the server over and over again, where all the behavior logic is carried
out. Both of these approaches stink, the first because it wastes huge
amounts of bandwidth by sending a lot of unnecessary bits over the wire
(see www.yahoo.com for a good example -- look at the source code), and the
second because it wastes huge amounts of bandwidth by forcing a lot of
incremental transactions between the client and the server. But at the
moment, they're the only way to create any user interactivity on the web.
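To make that concrete, here's roughly what approach 1 looks like -- a hedged
sketch, not anyone's production code, with a made-up "status" element and
TypeScript "any" casts because the vendor DOMs it targets (IE's document.all,
Netscape 4's document.layers) predate any standard typings:

    function setStatusText(text: string): void {
      const doc = document as any;
      if (doc.all) {
        // Internet Explorer 4+ path
        doc.all["status"].innerHTML = text;
      } else if (doc.layers) {
        // Netscape 4 path: updating a layer means rewriting its whole
        // document, TABLE tags and all
        const layerDoc = doc.layers["status"].document;
        layerDoc.open();
        layerDoc.write("<table><tr><td>" + text + "</td></tr></table>");
        layerDoc.close();
      } else if (document.getElementById) {
        // standards path -- and note that every branch above still ships
        // to every visitor, whichever browser they actually run
        const el = document.getElementById("status");
        if (el) el.innerHTML = text;
      }
    }

Multiply that by every interactive element on the page and you can see where
the bandwidth goes.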
Recently, we've started to see the inklings of a new way to build
interactivity. By using client-side XML/XSLT approaches, some developers
are consuming very little bandwidth while getting a very high degree of
interaction. Content is separated from presentation in the purest sense, by
sending a presentation interpreter independently of the underlying XML
data. What's generated is ultimately HTML, but it's HTML as determined at
the far end of the pipe, so it leverages the processing power of the remote
user and minimizes the bandwidth needed to produce a highly interactive
result.
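Here's a minimal sketch of that approach using the XSLTProcessor API found in
today's browsers; the two URLs are hypothetical stand-ins for a stylesheet
and a data feed:

    async function loadXml(url: string): Promise<Document> {
      const text = await (await fetch(url)).text();
      return new DOMParser().parseFromString(text, "application/xml");
    }

    async function main(): Promise<void> {
      // the presentation interpreter travels once...
      const processor = new XSLTProcessor();
      processor.importStylesheet(await loadXml("/ui/catalog.xsl"));

      // ...then each further request moves only raw XML; the HTML is
      // generated at the far end of the pipe, on the user's own processor
      const data = await loadXml("/data/items.xml");
      const fragment = processor.transformToFragment(data, document);
      document.getElementById("view")?.replaceChildren(fragment);
    }

    main();

Refreshing the view means fetching and transforming a new XML document; the
stylesheet never crosses the wire again.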
What we could and should have are new web development systems that leverage
that client capability. The J2EE model draws diagrams that go: data server
-> business logic server -> presentation server -> client. Bah humbug. I
want a server that sends presentation, business logic and data ALL to the
client, and lets the local processor there do all the work of putting them
together. That way, when there's a repeat transaction, all I'm sending
again is the data. If there's a branch in the business logic, the client
handles it, and no information exchange is needed. If the user's running in
a smaller window size or a giant resolution, the client handles the layout
management according to the rules I sent in the first place. When the
transaction is ready, I want a tiny SOAP transaction returned to my server,
not some fat HTTP POST request that gets thrown into my CGI interpreter.
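For the return trip, the envelope really can be tiny. A hedged sketch, with
an invented endpoint and action name:

    // once the client has assembled the result locally, only a small SOAP
    // envelope goes back to the server
    async function submitTransaction(orderId: string, total: number): Promise<void> {
      const envelope =
        '<?xml version="1.0"?>' +
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">' +
        '<soap:Body><SubmitOrder><id>' + orderId + '</id><total>' + total +
        '</total></SubmitOrder></soap:Body></soap:Envelope>';

      await fetch("/soap/orders", {
        method: "POST",
        headers: {
          "Content-Type": "text/xml; charset=utf-8",
          SOAPAction: '"SubmitOrder"',
        },
        body: envelope, // a few hundred bytes, not a screenful of form fields
      });
    }

That's the whole conversation: a few hundred bytes out, a status code back.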
That's a revolution, and it doesn't have squat to do with servlets and
blade configurations. It makes increased scalability of servers irrelevant,
because the amount of logic processing and data massaging done by the
server is almost nil. The only thing you need a server to do at that point
is handle security and validation of transactions. We ask for more work
than that out of our cell phones.
How does this relate to Domino? Well, it so happens that Notes has been
taking this approach for over a decade. When you use the Notes client to
open a document, you are requesting both the document data, and the form
design from the server. The form itself is delivered as a separate
transaction and it handles -- wait for it -- presentation and business
logic! That's why it has controlled access sections, encrypted fields,
validation & translation, events, rich text layout, color controls, and all
those other great things you can do with a Notes form. And it's even
cached! So requesting the next document simply results in a transaction of
the simple data in the document, not a reinterpretation and redelivery of
all that logic by the server. That's one of the reasons Domino scales better
with native clients than with browser clients. Views and other design
elements behave very similarly.
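You can model the shape of that exchange in web terms. Everything below is
hypothetical -- the real Notes transactions are proprietary, and the endpoints
and field names are invented -- but the caching pattern is the point:

    interface FormDesign {
      layout: string;     // presentation template
      required: string[]; // a declarative business rule: mandatory fields
    }

    const formCache = new Map<string, FormDesign>();

    async function getForm(name: string): Promise<FormDesign> {
      const cached = formCache.get(name);
      if (cached) return cached; // repeat opens cost zero bytes for the design
      const design: FormDesign = await (await fetch("/forms/" + name)).json();
      formCache.set(name, design);
      return design;
    }

    async function openDocument(form: string, id: string): Promise<void> {
      // transaction 1: the form design, served from cache after first use
      const design = await getForm(form);
      // transaction 2: just the document data
      const data: Record<string, string> =
        await (await fetch("/data/" + form + "/" + id)).json();
      // a branch in the business logic, handled locally with no round trip
      const missing = design.required.filter((f) => !data[f]);
      if (missing.length > 0) {
        console.warn("Missing required fields: " + missing.join(", "));
        return;
      }
      // ...apply design.layout to the data entirely on the client...
    }

Open ten documents with the same form and the design crosses the wire once;
only the data repeats.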