Some Kind of Forum
People get together to resolve big problems. Then nothing.
With Miroslaw Manicki
There is a big problem. We discuss this in our book “Continuing Bali G20 Success: Knowledge to Digital”. People get together. Smart people. Committed people. They share what they know in great conferences and convocations — using documents, media, infographics, and the like.
They socialize. They celebrate. They return home.
Nothing happens.
Why doesn’t anything meaningful happen? Because the people in question do not have the capacity to take what they know any further. They don’t have the tools. As Manuel Castells described in his 1996 book “The Rise of the Network Society”, they do not control the “switches” that would extend their knowledge in useful ways, so that others could make use of it (471).
Thus, there is a big gap, as can be seen below.
Why don’t information technologists work to close that gap?
It doesn’t bother them. They exploit the gap. In large part, that is what the latest barrage of promotion about artificial intelligence is about. We have written about this.
The question has been considered by some.
What kind of study and practice of technology implementation would qualify these individuals to be the judges of all? Feigenbaum, a leader in the computer science field, has said: “We could not break the knowledge acquisition bottleneck” (Tingey, 2014, 51).
It is important to note that the AI effort has involved “acquiring” knowledge from experts in one step and then deploying it on its own terms later. Direct empowerment of those experts has never been contemplated.
Famously, many of the technology leaders, AI and otherwise, elected to abandon their own educations because of their moneymaking prospects. They had coding skills, and they were able to create software, for the most part using existing tools. They found themselves positioned to inherit the power and credibility of IBM and other established companies with good reputations, companies that could not pursue the personal computing opportunity themselves because of antitrust constraints. This was IBM’s problem in particular.
I had personal experience of this. I was a venture capital partner at the Ventana Growth Fund in Southern California in the mid-1980s, and I was assuming the CEO role for one of our portfolio companies that was having trouble getting started. The company produced interactive medical procedure systems that allowed doctors and nurses to carry out critical care procedures using actual case charts from the University of Southern California Medical Center, with video vignettes added for realism.
The system used touchscreen technology, and we entered into a joint marketing venture with IBM. I led a team that visited dozens of IBM Customer Centers around the country, demonstrating the system and contracting for more business.
In the process of setting up the partnership, I spent several days at the unit headquarters to clarify our contract, working with a room full of IBM employees. I had sent a draft letter that included the word “exclusive”, and as the morning wore on, participants from my room drifted to a room next door until that meeting was full and our original meeting room held only me and my primary IBM contact.
The other meeting was being run by IBM attorneys. They were trying to find a way to deal with the word “exclusive”, which was forbidden in their deal with the Department of Justice. They figured it out, we created a paper trail that clarified that the reference came only from me, and the project went on.
The point is that Microsoft in particular benefited greatly from such conditions. It was able to inherit IBM’s credibility in personal computers, and from that point on it was like shooting fish in a barrel, as we Americans would say.
They did not inherit the same sense of the enterprise, or the same research and implementation commitments, that IBM and other deep research efforts had built up from the 1960s onward. Steve Jobs committed Apple and the Macintosh to the desktop metaphor, the mouse, and the rest because he thought it would make computing fun. Microsoft and the others followed suit. A barrage of shiny new tools replaced many mature, secure, and highly functional existing applications.
The “gee whiz” factor continues to be prevalent in that world, from Apple’s “daylight raid” on the graphical interface at Xerox PARC in search of making computing fun, to decades of unfulfilled features and breathtaking inefficiencies (Waldrop, 2001, 440–443). It was said that “what Andy giveth, Bill taketh away”: the new computing power delivered by Andrew Grove and Intel was consumed by the hugely resource-intensive storage and computational demands of Mr. Gates’s Microsoft software. The personal computing world grew out of give-and-take of that kind, in which announced features often appeared only in subsequent versions of the software.
The “gee whiz” factor fueled bigger and bigger numbers going into and coming out of Silicon Valley for decades. There were notable corrections along the way, most visibly the dot-com crash of 2000, but the “fun” revolution carried them a long way. Then came the era of unicorn investments, when very large sums were invested at levels that valued the ventures at billions of dollars, with little else to call on (Griffith, 2024). The earlier dream had been that billions might be taken out of such ventures, not put into them. Something had to give. One venture capitalist declared in 2011 that “software [would] eat the world” (Ibid). You can’t get any more gee whiz than that.
The good times ended with the failures of Credit Suisse and Silicon Valley Bank (Beltran, 2023). The unicorn companies had piled up, and there was nowhere to go from billions committed up front (O’Keeffe, 2023). Gee whiz finally became “Cheese Whiz”. In fact, many of the unicorns did not even qualify as going concerns, as real businesses. They had parked a great deal of money in local banks. Then came the run. From March 8 to May 1, 2023, just two years ago, the banks in question were shut down one after another; the last of them, First Republic, was seized and sold to JPMorgan Chase.
There was a ‘cleanup on Aisle 9’ at Silicon Valley Bank and First Republic Bank, two of the largest bank failures in U.S. history, and Credit Suisse and other investment banks retrenched. Credit Suisse became a subsidiary of UBS, but billions of dollars of its bonds were written off rather than honored by the Swiss authorities. That channel has not dried up entirely, but it is a trickle compared with the money the industry needs to put to work, and to leverage, to boot. There are record levels of “dry powder” in the industry, money that needs to be invested.
They needed something. It had to be big. It had to introduce information-processing gee whiz at a whole new level. Money is still parked; there needs to be a new round, one that will fulfill the ‘prophecy’ and “eat the world”. Enter the AI phenomenon in all its glory.
It is a stretch, although torrid levels of promotion hide the fact. Bill Gates described the problem in a television interview on September 19, 2023.
Bill Gates: Like everybody who’s involved with this, the way that it’s actually representing knowledge we don’t fully understand. We know how we trained it and made it guess and fill things in. We know how it figures out words and speech, but the fact that it’s so good, the exact specifics of where and how it’s storing things, we’re still researching...
Ari Melber: So we teach it, but then it teaches us.
Bill Gates: Yes, but it’s not perfect yet. Yeah.
Ari Melber: Well, that goes to something that you and others have warned about. You mentioned the meeting with the Senators. It seems there’s a version of this where humanity gets it right. We use this for good things. You just spoke about that…
Bill Gates: If this technology goes wrong, it can go quite wrong and we want to be vocal about that. We want to work with the government to prevent that from happening (The Beat with Ari Melber, MSNBC, September 19, 2023).
Really, are we going to do this? And why? Why are we in a hurry? Did people stop thinking? There is the great divide between knowledge and networked use of that knowledge — the point of this article.
Mr. Gates says he and others are eager to learn from the AI computers. He says that they do not know how the AI computers make their choices. He and the others are clearly comfortable with that.
Is anyone else? Truly, great effort goes into knowledge-seeking. There are quantitative methods for this, and there are qualitative methods for this. Had Mr. Gates stayed at Harvard longer than he did, he probably would not have reached research methods until graduate school, after four years of basic study. Famously, he has read a good deal over the course of his career. That is not the same. Introduction to knowledge-seeking at the university level is just the beginning. People spend decades of intensive effort in such activities. That is what produces the vast pool of human knowledge we need to benefit from.
Here are two figures that illustrate aspects of quantitative and qualitative study. Gifted scientists and practitioners spend lifetimes mastering even one or two of these methodologies, and as the figures show, there are dozens of factors to consider (Bogdan and Biklen, 1982/1998).
Obviously, computers can compute. They can calculate. Does this mean they can make the kinds of judgment calls needed to satisfy the design requirements of quantitative studies? Such judgments are very difficult to come by (Campbell and Stanley, 1963).
It is highly questionable whether computers can support qualitative studies of many kinds. These involve human traits and abilities that extend beyond computational skill: judgments, sensory perceptions, and emotions themselves.
What’s the rush, guys? Let’s catch the ball before we run. Let’s organize and reflect our knowledge, qualitative and quantitative, in useful digital form. Then we can see about your “hot money” problem.
Back to “some kind of forum”
As described in the following book, the best answer one group could offer for how experts might deploy their own knowledge is that we need “some kind of forum”, after the conferences and publications, to resolve this problem.
They do not call for ‘this’ kind of forum or ‘that’ kind of forum. That would at least connote some kind of advancement. What we see instead is a wish, an empty notion.
What is the price we pay from this unresolved problem? What price will our children pay, and their children, if it is not resolved?
Look into it. We do not want to find out what the costs of this cognitive failure to launch will be. We certainly do not want people to have to pay that price over and over again, with potentially terminal consequences for the way we live and prosper.
Fluidity is an important aspect of the solutions. It is considered in the books above. If you want to know more, investigate the Fluidity Library below.
References
Beltran, L. 2023, March 25. How Credit Suisse redefined the IPO market during the Dot-Com boom — then lost its grip and spiraled into scandal. Yahoo Finance. https://finance.yahoo.com/news/credit-suisse-redefined-ipo-market-120000350.html
Bogdan, R. C., and Biklen, S. K. 1982/1998. Qualitative research for education: An introduction to theory and methods. 3rd ed. Needham Heights, MA: Allyn & Bacon.
Campbell, D. T., and Stanley, J. C. 1963. Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin Company.
Castells, M. 1996. The rise of the network society. The information age: Economy, society, and culture, Vol. 1. Malden, MA: Blackwell Publishers.
Griffith, E. 2024, December 13. What is venture capital now anyway? New York Times. https://www.nytimes.com/2024/12/13/technology/andreessen-horowitz-benchmark-venture-capital.html?smid=url-share
O’Keeffe, D. 2023, November. Herd on the Street: So many unicorns, so little cash: Billion-dollar start-ups are common these days. Bain. https://www.bain.com/insights/herd-on-the-street-so-many-unicorns-so-little-cash/
Tingey, K. B. 2014. The angels are in the details. Control and regulation…in a good way. Logan, UT USA: Profundities LLC. https://a.co/d/gEv9D9C
Waldrop, M. M. 2001. The dream machine: J. C. R. Licklider and the revolution that made computing personal. New York: Penguin Books.