Meta trends - what have we learned?

Started by jaslar on 12/24/2013
jaslar 12/24/2013 9:45 pm
First, happy holidays to all. And thank you for making one of the most interesting communities on the web.

As I've (mostly) lurked on this site, I have observed several themes.

1. Information gathering. Many of you are roaming and reaping. Let's call that "research." You find fragments (text, image, other) of interest or relevance, and you want to mark and remember those things.
2. The arrangement of ideas. This is the interim stage of consideration, engagement and interaction. You toss snippets into buckets, then link and compare and test those snippets. I suppose it's really about pattern recognition.
3. The generation of new ideas. Ultimately, the intent of this activity is to create something new: to glean new meaning.
4. The publication of one's work. Here, the idea is to offer it back to the world.

I've been poking around the site the past few days, dipping into previous discussions, and do see some trends.

First, there does seem to be a push toward multi-platform tools. This is balanced with some healthy skepticism about life in the cloud (with services that may wink out, not to mention being mined by entities unknown).

Second, the interoperability of data is key. Platforms change, software changes, but many of the folks here assert the need to preserve previous work. (I myself have moved from CP/M to DOS to Windows to the Mac to Linux.)

I have some other ideas and observations, but I'm curious to hear from you. All of you are constantly on the alert for new tools. (That would be CRIMPing!) But what are you seeing about the evolution of those tools -- and the trends in information management?



Dr Andus 12/24/2013 11:28 pm
jaslar wrote:
All of you are constantly on the alert for new tools. (That would
be CRIMPing!) But what are you seeing about the evolution of those tools
-- and the trends in information management?

Hello jaslar,

Good summary and interesting analysis!
I often find that the "new tools" I find are not necessarily new to the world but they are new to me, and that the interesting tools are not necessarily the most recently created ones. There are a lot of oldies that are still goodies out there, to be discovered.

Also, I'm not sure there is always an "evolution" out there. With the rise of smartphones, tablets and web apps there is also a push for the mass market, which brings a certain amount of dumbing down with it. This may lead (or have led) to sophisticated applications being abandoned and developer talent being diverted away from the kind of software we're interested in here.

Nevertheless, on the whole I'm optimistic, as I keep running into surprising developments when I least expect them.

Happy holidays to all, and all the best in the New Year!
jimspoon 12/25/2013 2:24 am
Sitting here with relatives who are watching "It's a Wonderful Life", which I already saw two days ago! So I am browsing about with my trusty laptop.

One thing I've noticed is the proliferation of outliner apps for iOS and Android, while I have seen very few new outliner / pim / notetaker programs for Windows. So it seems like there's been a shift from development of programs for desktop os, to development of webapps and apps for mobile os. As Dr. Andus indicated, this shift has involved some dumbing down. I suppose as mobile and web platforms become more powerful and sophisticated, we can expect to see more sophisticated outliner / pim / notetaking apps for these platforms.

jim w.


22111 12/25/2013 3:38 pm
"Sitting here with relatives who are watching “It’s a Wonderful Life”, which I already saw two days ago! So I am browsing about with my trusty laptop."

That would have to be called "more general crimping", for input, too - well, that's what all those non-outlining people out there do all the time, right? (We just enlarged this general concept to being unhappy with insufficient software.)

Cloud, collaboration, multiple platforms

I don't want to appear negative here; I fully acknowledge that all this is going cloud in the end, since that, yes, will greatly facilitate collaboration (at a price, but I'll shut up here). I would simply like to point out that Petko, who now seems willing to spread MI to slates, could better concentrate on his core work of optimizing MI if MS had ever had the obvious idea of putting some money (together with the processor maker of their choice) into developing processors and "peripherals" that would let your regular Windows application run on a slate with both the weight and the battery life of an iPad or an Android device. As it is, many, many developer hours are being lost to work that should have been unnecessary in the first place.

Text bits processing, "Atlantis"

Also, and this has been mentioned here in the past, what today's outliners offer with regard to text processing, both within their editors (intra-item) and within their tree (inter-item), is quite basic (or outright non-existent, for the latter category). Both myself and "Dr Andus" have insisted on the importance of such functionality (of which you speak here, "between the lines"): re-arrangement of info bits, between/among items, is not supported by most outliners. This makes additional appeal, and rightly so, for 1-pane outliners, or for traditional word processors with good outline functionality superimposed, as in Word (with or without add-ins for that) or in Atlantis (of which Prof. Kühn raves, and with good reason, even though that fine software lacks important functionality by other criteria, e.g. cross-referencing). In fact, you can do 10,000-page files with Atlantis - people report having done this successfully - but any cross-referencing there can only be done with external mark-up surrogates. Nor are there search-result "tables" or lists, which would be so helpful in a word processor able to handle texts of any size: a "table" would show meta data like "within which chapter/sub-chapter", a "list" would just show the immediate contexts of the hits, and even a simple "list" would be of tremendous help here. But then, for mimicking a basic (and even not-so-basic) 1-pane outliner, Atlantis is outstanding software! With today's outliners, if you shuffle MINOR bits around a lot (and academic workers necessarily do this), Atlantis might be a VERY viable solution (as, for most of them, Word is, not unexpectedly). So all our crimping is caused by the inability of current outliner developers to listen to us (and yes, I do a lot with external macros to fill those gaps, but it's all so inelegant when you have to do it from outside the native code of the software you use).

Developers not listening as much as they should (but doing cross-platform translation work instead, sometimes)

No content here, I'll spare you my rant and shut up. ;-)

22111 12/25/2013 4:05 pm
I forgot the obvious: in Atlantis, no tables, and hence probably no formulas, and no good picture handling (but, as said, really good outlining functionality).

And I perhaps did not make my "little chunks of text vs. bigger chunks of text" observation clear enough:

We had the discussion here about 1-pane vs. 2-or-more-pane outliners, and it seems the big appeal of the 1-pane variety (or of Atlantis/Word instead) lies in the ease of accessing and handling even minor bits/chunks, whilst our current 2-pane outliners 1) put, by their generic design, obstacles in the way of that process, and 2) do nothing to overcome those obstacles (whilst many such possible little "helpers" are imaginable). Just one very basic example: the majority of 2-pane outliners, when you switch back and forth between two items within the same file, do NOT even return to the other item's cursor position, but to the beginning of that item's text.
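That last complaint is striking because the fix is so small. A hypothetical sketch (all class and method names invented, not taken from any actual outliner) of remembering a caret position per item, so that switching back restores where you left off instead of jumping to the top:

```python
# Hypothetical sketch: per-item caret memory in a 2-pane outliner.
# All names here are invented for illustration only.

class Editor:
    def __init__(self):
        self._caret_by_item = {}   # item id -> last caret offset
        self._current_item = None
        self.caret = 0

    def switch_to(self, item_id):
        # Save the caret of the item we are leaving...
        if self._current_item is not None:
            self._caret_by_item[self._current_item] = self.caret
        # ...and restore the caret of the item we are entering,
        # instead of resetting to the beginning of its text.
        self._current_item = item_id
        self.caret = self._caret_by_item.get(item_id, 0)

ed = Editor()
ed.switch_to("item-A")
ed.caret = 120          # user edits somewhere mid-item
ed.switch_to("item-B")
ed.switch_to("item-A")  # caret is restored to 120, not reset to 0
```

A dictionary lookup per item switch is all it takes, which is why the absence of such "helpers" reads as developers not listening rather than as a hard technical problem.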

Thus we have, in 2-pane outliners, two different concepts (bad, bad!) for pieces of content that vary only by size and should hence be treated as similarly as possible. For chunks of text big enough to constitute an item of their own, all goes well - perfect processing; here the 2-pane concept excels. But whenever a chunk of text is NOT big enough to "justify" an item of its own (be it a paragraph, several paragraphs, or even just a sentence or less), then in our 2-pane-outliner "workflow" we get into deep trouble: our tool clearly gets in our way.

And of course, such problems should have been attacked years ago, from our side (by thinking about them) and from the developers' side (by implementing the suggestions we might have brought in). It goes without saying that such problems will not evaporate in the cloud but have to be resolved, at least for academic writers, who shuffle such bits around not just here and there but dozens of times a day. The same goes for inter-item cross-referencing that remains valid for publication, which neither any outliner nor Atlantis offers, but Word does. So no wonder the outliner community remains that select, partly because of home-made "omissions" on both sides: on the user side, the asking for these necessary elements is not strong enough; on the developers' side, such "academic basics" are considered just "too demanding".

Info crimping is one thing: there can never be enough info out there to ingurgitate, except for morons. But outliner crimping is caused by numerous, blatant omissions in this special field of the "industry": We have A Right To Be Unhappy! - omg ;-)

Alexander Deliyannis 12/25/2013 8:18 pm
Just over 5 years ago I wrote a brief 'wishlist' of features for my information management tools: http://www.outlinersoftware.com/messages/viewm/4514

Quite surprisingly (for me), the debate continues, even though I believe that the points I then identified are now confirmed trends.

There's more of course: interoperability has shifted to the web domain, with ecosystems of applications now growing around Evernote and Google Apps just as they once grew around Outlook. Computer use is still expanding and, as a result, programmes are becoming more 'intuitive', often at the expense of capability; I consider the trend towards mobile apps in this context, simply because I consider smartphones and tablets computers. Everything is becoming 'social', everything should be 'shareable', at the same time that privacy concerns are growing.

Last but not least, a growing number of users--be it age- or culture-related, or both--expect their tools to be offered for free. I wrote elsewhere about this so I will not elaborate further here. But is it any wonder that most independent developers are not prepared to take major risks, by investing in innovative, powerful and--consequently--complex applications?
Stephen Zeoli 12/25/2013 10:44 pm
Alexander Deliyannis wrote:
Last but not least, a growing number of users--be it age- or
culture-related, or both--expect their tools to be offered for free. I
wrote elsewhere about this so I will not elaborate further here. But is
it any wonder that most independent developers are not prepared to take
major risks, by investing in innovative, powerful
and--consequently--complex applications?

This is the first thing that occurred to me, too. Can you imagine a programmer undertaking the development of an application like Tinderbox today? Tbx is a multi-dimensional tool that runs on one platform. The trend is to make focused (i.e. limited) apps that run on multiple platforms.

Steve Z.
Alexander Deliyannis 12/26/2013 8:30 am
jimspoon wrote:
So it seems like there's been a shift from
development of programs for desktop os, to development of webapps and
apps for mobile os. As Dr. Andus indicated this shift has involved some
dumbing down. I suppose as mobile and web platforms become more
powerful and sophisticated, we can expect to see more sophisticated
outliner / pim / notetaking apps for these platforms.

I don't think that the lack of sophistication of software has to do with a lack of power in the platforms, at least with respect to the web. One can do anything on the web; at the end of the day, these are client-server applications, with server platforms being much more powerful and sophisticated than their desktop counterparts.

I really think that the issue here is the business model. Privately developed applications are becoming more 'dumbed down' because their owners/developers can't afford the risk of making them more sophisticated, whereas those who have the means invest in what they believe can become mainstream.

Steve Z. provided the excellent example of Tinderbox. I would add Zoot: consider its sophistication (it's actually many applications in one), due in great part to the connectivity expected in the contemporary context; I can only sympathise with Tom Davis.
jimspoon 12/26/2013 7:09 pm
good points Alexander.

I'd like to formulate some good thoughts about it right now but the brain isn't cooperating.

But I will throw in one thing that interests me - look at the prices of mobile apps: $0.99, $1.99, $2.99, $4.99, etc. I guess simplicity of the apps = low development costs, and with high volume the development of such apps can still be profitable.

I guess for complex desktop programs, the volume is relatively low, and the development can only be profitable if a much higher price is charged. Still, it does make me wonder if developers of complex desktop programs would actually maximize their profits by offering them at much lower prices that would stimulate greater demand.
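The question above is really a back-of-the-envelope elasticity calculation: profit = price × volume − development cost, and a price cut only pays off if demand grows more than proportionally. A quick sketch with entirely made-up numbers (none of these figures come from the thread):

```python
# Back-of-the-envelope pricing sketch with invented numbers:
# does a lower price maximize profit? Only if demand grows enough.

def profit(price, volume, fixed_cost):
    """Profit = revenue minus (roughly fixed) development cost."""
    return price * volume - fixed_cost

fixed_cost = 50_000  # hypothetical development cost

# Scenario 1: niche pricing - few buyers at a high price.
high = profit(price=100, volume=1_000, fixed_cost=fixed_cost)      # 50,000

# Scenario 2: cutting the price 10x needs 10x the buyers
# just to match scenario 1...
low_same = profit(price=10, volume=10_000, fixed_cost=fixed_cost)  # 50,000

# ...and only wins if the mass market actually materializes.
low_mass = profit(price=10, volume=25_000, fixed_cost=fixed_cost)  # 200,000

print(high, low_same, low_mass)
```

Whether the extra volume actually exists is exactly the niche-market doubt Franz Grieser raises in his reply.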
Franz Grieser 12/26/2013 10:35 pm
jimspoon wrote:
...
I guess for complex desktop programs, the volume is relatively low, and
the development can only be profitable if a much higher price is
charged. Still, it does make me wonder if developers of complex desktop
programs would actually maximize their profits by offering them at much
lower prices that would stimulate greater demand.

Hm. I'd say the tools we discuss here are niche products. Though almost all of the readers of one of my newsletters (covering Microsoft Outlook) have a licence for OneNote, only a small percentage actually uses the software; they do not even use the notes feature in Outlook. A number of journalists and writers I know use Evernote for collecting notes - it's free and multiplatform, and it's all they know. But beyond these information workers: the big void.

There simply is not a mass market you could address by offering complex applications at a cheaper price.
Steve 12/27/2013 1:00 pm
Food for thought about useful productivity programs that take time to learn: "18 Epic Productivity Apps...." http://www.coolcatteacher.com/best-productivity-apps/
I found the link via http://researchbuzz.me/

Looking over the list, all seem to be focused "apps" that are free or close to it. The other thing I noticed was the lack of a way to organize all that "stuff" and find it later.

Steve
Dr Andus 12/27/2013 2:45 pm
jimspoon wrote:
But I will throw in one thing that interests me - look at the prices of
mobile apps: $0.99, $1.99, $2.99, $4.99, etc. I guess simplicity of the
apps = low development costs, and with high volume the development of
such apps can still be profitable.

But it raises the question of what exactly is being sold here. And it seems to me that the focus is more on selling the hardware than the software. E.g. the amount of money I spent on iOS apps in the last 3 years is dwarfed by the cost of buying my iPad 1.

However, one by one, the apps I bought are dropping support for my iPad 1 (which in many other respects still works fine), forcing me to shell out a significant amount of money for my next iPad once more.

In contrast, I also bought a top-of-the-range PC 3 years ago (Win7, 64-bit, 8GB RAM), and I feel no need to replace it whatsoever. While PCs have become mature and reliable products, we are now forced into a planned (?) obsolescence game with the tablet manufacturers, or rather the OS developers (Apple, Google, and MS).

jimspoon wrote:
I guess for complex desktop programs, the volume is relatively low, and
the development can only be profitable if a much higher price is
charged. Still, it does make me wonder if developers of complex desktop
programs would actually maximize their profits by offering them at much
lower prices that would stimulate greater demand.

The problem for traditional desktop software developers is that the whole nature of the game has shifted. It's becoming complex calculation/guesswork as to which business model to go for, both in terms of platforms to develop for and in terms of licensing regimes and pricing strategies. My guess is that the successful ones will be those who can develop an ecosystem of sorts, with support for multiple platforms (incl. mobile), syncing, web access, and the possibility for other developers to write scripts, add-ons, etc.
jaslar 12/28/2013 3:52 am
Fascinating link. Thanks. A powerful search is certainly important. But on contemplation, I have three more observations. First, although I want things to get more organized, I've noticed that it's possible to get so organized that I no longer enjoy my life. There has to be room for some spontaneity. This may be one of the reasons I am not a (full-time) teacher. Second, although I'm happy to jump among a few programs while I produce something, it matters to me that the products wind up in one place. For me, that ultimate filing system is Notecase Pro (back to searchability). Third, simplicity (streamlined UI, plain text files, ubiquity on multiple platforms) has risen as a value for me. I get why people like Evernote. But Simplenote seems to be edging it out lately. I just want to capture the thought, tag it, and be able to retrieve it later.

There comes a point where the complexity of a system overwhelms the work I do with it.
MadaboutDana 12/30/2013 11:02 am
Great discussion! But I think we're overlooking - or perhaps misidentifying - a trend I've just mentioned in the iA Writer thread. Touch apps are not always dumbed-down. Sometimes they've found simpler ways of doing things that were formerly over-complex. I think that's an interesting and worthwhile trend, and is being reflected in newer desktop applications, too - especially on MacOS and the web. What the touch platforms have highlighted more than anything else is the widespread dislike of "fiddly" interfaces.

Efforts to produce streamlined, non-fiddly interfaces can, of course, go too far (on this subject, it's interesting to review reactions to iOS 7 over the past few months). But I get huge satisfaction out of using an app with a genuinely straightforward interface that's also elegant and powerful. In the outliner world some of my favourites are on iOS (Notebooks, OneNote, Outline+, Notability). But there's still a hurdle very few apps have overcome - touched on by Jaslar's reference to Simplenote. It's difficult to reconcile immediate availability with sheer power. A feature-filled app is always going to take longer to load than a streamlined single-function app. And one of the huge benefits of touch environments is their immediacy.

Which brings me to another trend we haven't really discussed: sharing of info between apps. I'm a great cross-platform fan, of course, and am looking forward to interesting developments in this area. But it's also interesting to watch the trend for info sharing between apps on the same platform (especially iOS). The "walled garden" approach has called forth considerable ingenuity - it's well worth visiting the brilliant MacStories blog by Federico Viticci (http://www.macstories.net/) for some thoughtful discussions of the best solutions. I think this trend will continue and accelerate - it will be interesting to see if this encourages Apple to do something they've been urged to do for a long time now: create a shared files area. If they don't, existing solutions like Dropbox (and Box, with their recent Editor app) will undoubtedly continue to evolve into sophisticated app ecosystems (another interesting trend) that are also cross-platform.

My own dearest wish for 2014: a truly cross-platform note-taking app that's fast, efficient, allows hierarchies, tagging and (highlighted) searches. Simplenote's okay, but it's not great (yet).

Interesting snippet of news on that score: Apple appears to have bought out Catch (http://techcrunch.com/2013/12/23/apple-reportedly-acquires-note-taking-app-catch-broadmap-talent/). That was a nice little app. But I'm more confident in developers like 6Wunderkinder for full cross-platform solutions - Apple tends to restrict its activities to its own platforms.
jaslar 12/31/2013 5:04 am
Yes, thank you, that's one of the things I wanted to say. I want things to be just complex enough to accomplish my few tasks, but not so complicated that I forget what I wanted to do, and can no longer remember where that function might be. I've thought for a long time that Gedit (Linux Gnome text editor) with an outliner plugin (expand, collapse, drag and drop) would take care of most of my computer needs.

But Gedit is free. And as several folks have pointed out, why bother developing something free for a niche platform? I've been waiting a long time for that modest little plugin.

So I think it's true that the market constraint is to develop a small set of features, compellingly presented, that can be quickly distributed across many platforms for a small amount of money. The same kind of thing is going on in the world of publishing: small, independent authors getting works onto ubiquitous platforms, selling them cheap, but sacrificing a bit of review and quality control - and in the process offering immediacy for the smartphone generation. But also, perhaps, making a little more money per person because the overhead is lower. And maybe, too, refining interfaces in a useful way, excising the cruft of over-elaborate design.
22111 12/31/2013 11:27 pm
"The problem for traditional desktop software developers is that the whole nature of the game has shifted. It’s becoming a complex calculation/guess work as to which business model to go for, both in terms of platforms to develop for, and licensing regimes and pricing strategies to go for. My guess is that the successful ones will be those who will be able to develop an eco-system of sorts, with support for multiple platforms (inc. mobile), syncing, web access, and possibility for other developers to write scripts, add-ons etc."

I am very critical of the current MS slate offering, and they should have kicked Ballmer out 10 years ago... but then, at least he is out now.

I simply cannot imagine that MS slates will remain in their current shape, i.e. either ridiculous weight (too much) and ridiculous battery life (too short), OR not Windows-compliant (= their ridiculous RT series).

Unfortunately, I did not put my money into Apple when their very first iPad was announced/released, because I wrongly thought at the time that its incompatibility with the Mac OS (iOS, or whatever it was called 3 years ago) would hamper its sales, prospects preferring to wait for a compatible offering, either from Apple or from the Windows world - as you will remember, Windows slates were available then but very thick and very heavy.

Time has proved me very wrong, but my original idea cannot be that wrong: if something compatible is available, users will prefer a slate where there is NO "translation" of data, where they have their "original" PC programs and their original data to process, just as in ancient times when data was shuffled from PC to notebook and back (which was complicated enough).

So what Alexander says in the above citation will again be partly overturned the moment acceptable Windows slates reach the market. They will not necessarily come (and preferably will not come) from MS, but they will make regular Windows applications available on your slate, and many people will then buy slates: both new slate users (like me) and people who by then will have had one or more slates on other systems - as we know, many people today use a PC but an iPad, too.

I think this split is so aberrant that most of those users will return to Windows on both platforms, office and slate.

As for cloud services, of course they will become predominant (since 90 percent of people don't care who has access to their data), and I am aware that this is another core factor in this game, since it will weaken the aforementioned effect of users being able to use the same data, in the same way, on both their desktop PC and their slate.

Personally, I hope that acceptable slates will come very soon, but I acknowledge that if they do not, it might be too late: most PC applications, apart from the heavyweight ones (graphics, video editing, CAD, etc.), will have been replaced by cloud applications by that time.

If this is about to happen, you can all be sure no data will ever be safe again, and no application will be bought; everything will be rented, with yearly subscriptions for everything.

I know that the cloud, the web, is very practical for many things, but overall and for most people, it's a honey pot. Today, you have a choice; if developers are driven to cloud applications (and they will be, if they see that those are perfectly accepted, on both the data-safety and the financial front), many people will mourn the PC age that will then be lost (with elderly PC users just continuing to use legacy software into their old age).

At the end of the day, for developers, web applications are a much better business model than PC applications ever were, and most users are eager to switch to a business model that is, in the end, to their disadvantage.

Long-term, the game is over, we know its outcome; it's just the very next moves that might be of interest to some of us.

Just these days, the ex-eBay woman who now runs HP announced that she will kick out not another 26,000, but another 34,000 staff from this once first-rate company.

In fact, the business model I prefer, and to which HP and MS clung in the past, is already dead today (and MS, with Office 2013, is already switching horses).

The web is the biggest honey pot in history, for the masses; and for the smarter people, they put some little honey pots into the big honey pot, in order to get them glued, too.

I think I'm beginning to understand why no serious developer invests another buck in PC applications.

The irony here is that those "dumb workstations" from some years ago, meant to be connected to the corporate server instead of regular PCs, never took off, but that now very thin and light slates will more and more be the "dumb but perfectly sufficient end device" for ever more sophisticated web applications, which will bury the PC for good: even CAD will be done on (bigger) iPads, since the respective software houses will not upgrade their PC versions but replace them with the corresponding web applications.

And in 20 years, they will implant chips in your brain, and in 25 years that will become mandatory.

22111 1/14/2014 1:32 pm
Dirge, continued

Today, the news sites report that Google is buying "Nest" (also the German word for "nest", "home") for $3.2 billion. The current owners say that the authorities will only be allowed to put special add-ons into those net-connected smoke detectors etc. in special cases; Google will certainly adopt a broader policy for non-smoke data to be transferred to the web, too, once such data transmission is possible without making too much noise. spiegel.de thinks that "Google wants to get into your bedroom", when in fact it's in the office that the really interesting things are discussed today.

From this, I got a link to a two-month-old article, http://www.welt.de/wissenschaft/article122156237/Windows-10-laeutet-das-Ende-der-Freiheit-ein.html which refers to xbitlabs.com and muses about Win 10 (what about 9?), and which is of interest in light of the above since, one more time, reality seems to go further than imagination (OK, it's not reality yet, but very plausible indeed). The Welt article says that closed systems like the iPad, Kindle, etc. are seen today as role models for any development to come, and that even on Android, makers try their best to make that open system as closed as they can, in order to get hold of the software market for their respective hardware.

The article says that MS will design Win 10 - attention, we are making the switch from software to the OS here, and that's the real interest of this post - in such a way that the user will need the web to run his software. In the post above, I spoke about notebooks and slates as "dumb but perfectly sufficient end device[s]" for software running from the cloud, but I also had in mind that some people would be able to choose to continue working with their old, non-cloud software, on their notebooks at least. This MS move to replace even Windows itself with a cloud Windows means they are trying to prevent exactly that choice of yours; of course, taking the full step instead of doing it halfway is very logical from their pov.

So what does that mean in practice? Technically, it will be possible to destroy old Win versions on your device (let's say an old notebook running Win 7, the last acceptable version) the second you imprudently connect it to the web (e.g. just to browse some site for urgent info, when you would normally use some other device for such a task). Technically, it's also possible that e.g. Win 7 will be made self-destructing, e.g. by Jan 1st, 2019 or such, and/or in response to a total absence of net connection for that device (= that specific Win installation); e.g. self-destruction could be postponed for some 2 years by every net connection (allowing MS access to that installation). And of course this can be applied in combination: for the time being, self-destruction is postponed by such connections, until at some point any further connection will instead trigger that destruction.

On the other hand, all those Win versions, at least from XP on, cannot be installed on machines (old or new) without MS's approval, i.e. it will be of no use to buy old machines with old Win versions in order to keep your current software running, since in any case MS (= "they") will be able to prevent you from using the OS you need for that. (This raises the question of whether there are virtual Win installations on Macs that do not phone home; but then, the Apple OS itself will very probably do that, so you'd be back at square one.)

So this means that, contrary to what I had thought above, you cannot even be sure that you will be able to use your old software on old machines (and it can safely be predicted that during some transitional years, XP/Win7 laptops will cost $10,000 on eBay and such); in the end they will simply take away any non-net OS from you, and then Win 3.1 computers will be in high demand.

From another pov, we're currently in the very last years of PC "standardization by OS". Not only is the net to be considered the biggest honey pot in history (since it drew people to it by their own free will, for years, before becoming mandatory some time from now), but the MS Windows system has to be considered a major honey pot in its own right. We all (?) thought we were "safe with it", in the sense that there would be "development", causing some incompatibilities here and there; we weren't told that some years later every foundation of that system would be taken away from us, and that WITHIN that very system there would be a total paradigm shift: FROM "pc" (which means "personal" in the sense of you being the owner of your data, at your place) TO "dumb end device, not even storing your data anymore, let alone the software, with even the OS now being just that part that makes your device run the real OS on THEIR servers". If people had foreseen such a development, many of us, I suppose, would not have touched Windows to begin with.

From yet another pov, the move of developers to cloud applications now makes even more sense, since in some time they cannot even count on the presence of some "Win" OS on the systems their software is expected to run on. They are literally driven to program for the cloud, since otherwise it's foreseeable that the lifetime of their programming efforts' "product" will be rather short, and they could not recoup their investment (in time and/or money).

And from another pov yet, it becomes even easier to understand why people like the TB developers use an "inferior" programming language like Java today: it's the user who has to live with the drawbacks of such a choice, but for the developers that choice opens up much broader current marketing possibilities, without them having to shelve large quantities of expensive code rather soon.

So, in the end, it's not only that users are willing to rent net sw (a marketing scheme that never really took off with individual users, at least in the now expiring pc age), but even more important, the underlying os necessary for any sw to run at all will be taken away from users, and from developers too, in the foreseeable future.

Of course, there are "alternative os", but first, they will never be adopted by individual users at large, and second, that piece of sh** we're speaking of here, Linux, never resolved the core problem of how to run Win sw on it, and will probably never be able to do so. From another pov, this opens up a big business opportunity for anybody who'd be able to create an alternative os that just mimics today's Win, without MS being able to legally stop it, thus allowing for the preservation of any current Win sw once the original os versions are taken away for good.

But of course, by then most individual users will have completely switched to "all-net" (so there does not even seem to be a very big market for such an os development), notwithstanding the fact that not a second of computer work will then be done without "them" knowing down to the slightest detail what you are doing, let alone where you are doing it.

All this because most people (and this includes many smart people; we're not speaking here of those who cynically apply their better thinking to serving the "rulers") aren't "creational" in their thinking, but just consider what is there, i.e. they just let their thoughts flow within the confines of what's presented to them. The irony here is that the net, as a honey pot (for the time being, i.e. it's not a crime yet to do your pc work locally), deludes people with some ostensible "freedom" it pretends to open up for them.

So we see here that the cloud's problem by far exceeds mere "where's your data, and who might gain access to it" considerations. Of course, for corporations lots of problems will arise once Unix and its derivatives are also taken away from them, except for the "officially authorized" ones (Apple and such), and this point in time will come sooner than optimists might imagine today.

And don't fool yourself: hybrid sw like Surfulater NexGen will lose its hybrid character the second its developers judge that the moment has come to take the pc part of the code away from you. Right now they must be quite exasperated at having to do all this "unnecessary, additional" pc-part coding too, in order not to lose their current customer base, when in some 5 or 6 years that quite identical customer base (with some old people having left in the meanwhile, but many young people having joined the lot) will certainly not cause any more trouble, as it does today.

And of course, any discussion of, e.g., the editing details for shifting pieces of text around within the database of a 2-pane outliner might be considered quite irrelevant in the above-described larger context of what we do today.

For some of us, the pc wasn't available yet when we were young (so we remember those index cards in real horror), and the pc will not be available anymore when we're really old (or dead). Very ironically and very precisely, this includes the pc concept first of all, incl. notebooks, "personal" data on a slate/"smart" phone, whatever: they simply take the "personal" out of the "computer", and out of your life (cf. Google soon in every home/office, on the ceiling: in an introductory phase they will hide their cameras and all; in a subsequent phase it'll become a crime for you to cover their devices). Everything that can be done is done: surreptitiously first (the honey pot), then generalized and worldwide in the open ("1984" considered natural). And yes, the next step will be that they open babies' brains to inject the "necessary" devices in the very seconds after birth, and btw, what about intra-uterine interventions?

Well, you couldn't seriously have assumed that they would stop their control efforts after realizing that their subliminal messaging in the cinema theatre hadn't shown its hoped-for effects, could you?