While you’ve been enjoying the weekend, the brain cells here have been working overtime on various aspects of the “Futuring” problem discussed in Saturday’s Peoplenomics.
Without giving away the whole story, the problem for most of us is how to collect and assimilate enough information (when making investment decisions) that we are not investing our life savings in a crooked game.
Let’s just say the term “crooked” means investing where:
a) Insiders have significant advance information or
b) When pending orders are used by high frequency trading outfits to “front-run” trades and skim a bit off every legit trade or
c) When significant information asymmetry is involved in the release of information, such that only certain classes of investors know about pending price moves, management decisions, or other key variables before us “dumb” investors do.
Grady (chief coder guru of our www.nostracodeus.com project) and I have been slapping this problem back and forth this weekend and we’re coming up with some novel ways to approach the knowledge engineering side of things.
One way we can do this is to “bit bucket” all currently scheduled events by using time domain statements in news stories to sort them.
Let’s say we took all news stories that have the words “this Wednesday” in them. We already know from our news “tickler file” what some of the items making headlines will be by the time you get done with work Wednesday. Hell – half of Wednesday’s drive-home newscast is in the can right now.
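A minimal sketch of that “bit bucket” sort, assuming a simple phrase match: scan each story for a “this <weekday>” phrase and bucket it under the calendar date that weekday resolves to. The story strings and the Monday start date below are invented for illustration.

```python
from datetime import date, timedelta

WEEKDAYS = ["monday", "tuesday", "wednesday", "thursday",
            "friday", "saturday", "sunday"]

def resolve_phrase(text, today):
    """Return the date a phrase like 'this Wednesday' points at, or None."""
    lowered = text.lower()
    for i, name in enumerate(WEEKDAYS):
        if "this " + name in lowered:
            # days ahead from today to the named weekday (0-6)
            ahead = (i - today.weekday()) % 7
            return today + timedelta(days=ahead)
    return None

def bit_bucket(stories, today):
    """Group stories by the date their time-domain phrase resolves to."""
    buckets = {}
    for s in stories:
        d = resolve_phrase(s, today)
        if d is not None:
            buckets.setdefault(d, []).append(s)
    return buckets

stories = [
    "The Fed decision lands this Wednesday afternoon.",
    "Case-Shiller housing data arrives this Tuesday.",
    "An opinion piece with no schedule attached.",
]
print(bit_bucket(stories, date(2015, 7, 27)))  # a Monday
```

A production version would obviously need richer phrase handling (“next Wednesday,” “Wednesday the 29th,” and so on), but the bucketing idea is the same.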
For example, the Fed decision will be announced about 1:30 Texas time. That will be preceded by the release of petroleum reserve data a few hours earlier, and by the Case-Shiller housing data on Tuesday morning.
But it goes deeper than that: We know, for example, that Bernie Sanders will have a Washington State meet-up, and both Callaway Golf and Facebook will report earnings Wednesday, which will give us additional insight into how people are spending their spare time these days.
Of course, the art of the future is being right over the long haul. Ure’s truly warned in December of 2014 (back here) that social media wasn’t going much of anywhere. And sure enough, SOCL went to near-present levels in April of 2014, then again in July of 2014, and then in November of 2014, and quick – look surprised – we are there again…
Which makes it hard to just park some money – you are being squeezed into a trade-to-win regimen, and that, in turn, makes you hamburger for the high-frequency traders.
All of which is why we spend so much time pondering the future and working on how to beat it.
We’re thinking about an adaptation of the Delphi Technique for a limited set of participants (like Peoplenomics.com readers) because there seems to be some magic in using feedback to develop a keen sense of what’s going to happen – before it does.
But even the Delphi Technique, which is described in Wikipedia this way…
“The Delphi method (/ˈdɛlfaɪ/ DEL-fy) is a structured communication technique or method, originally developed as a systematic, interactive forecasting method which relies on a panel of experts. The experts answer questionnaires in two or more rounds. After each round, a facilitator or change agent provides an anonymous summary of the experts’ forecasts from the previous round as well as the reasons they provided for their judgments. Thus, experts are encouraged to revise their earlier answers in light of the replies of other members of their panel. It is believed that during this process the range of the answers will decrease and the group will converge towards the “correct” answer. Finally, the process is stopped after a pre-defined stop criterion (e.g. number of rounds, achievement of consensus, stability of results) and the mean or median scores of the final rounds determine the results.
The Delphi Method is not to be confused with a related technique for manufacturing consent in which an organizing party combines the input in a non-transparent way, giving the organizing party complete but non-obvious control over the outcome. A name often used for this deceptive use of the Delphi Method is the “Delphi Technique”.
Delphi is based on the principle that forecasts (or decisions) from a structured group of individuals are more accurate than those from unstructured groups. The technique can also be adapted for use in face-to-face meetings, and is then called mini-Delphi or Estimate-Talk-Estimate (ETE). Delphi has been widely used for business forecasting and has certain advantages over another structured forecasting approach, prediction markets.”
…has some limitations, so that requires more thinking.
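The round-and-revise mechanic in that description can be sketched in a few lines. This is a toy model, not anyone’s actual panel process: it assumes each panelist revises halfway toward the group median after every round, and the starting estimates and the 0.5 revision factor are invented.

```python
import statistics

def delphi(estimates, rounds=4, pull=0.5):
    """Toy Delphi: panelists pull toward the group median each round."""
    history = [list(estimates)]
    for _ in range(rounds):
        med = statistics.median(estimates)
        # each panelist revises part-way toward the anonymous group median
        estimates = [e + pull * (med - e) for e in estimates]
        history.append(list(estimates))
    return statistics.median(estimates), history

# five panelists forecasting some quantity; the outlier at 30 gets reeled in
final, history = delphi([5.0, 12.0, 8.0, 30.0, 9.0])
print(final)  # converges on the initial median, 9.0
```

The interesting (and worrying) property is visible even in the toy: feedback shrinks the spread, but it converges on the panel’s center of gravity, which is only “the future” if the panel was sampling it honestly in round one.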
This is one of those “Anyone can build a pyramid” problems…it just takes a lot of thinking to get it done with the least effort, and in today’s world that means all computerized.
The problem, once you get a sense of what’s scheduled for a particular day – like next Wednesday when we also know that Russia is likely to veto a UN tribunal on the MH-17 shoot-down – is in sorting out the thresholds where news will – or will not – actually move the market.
Here’s where it gets tricky: How do you turn events into numerical values that can be calculated and integrated against other data on a real-time (or NRT) basis?
Within the market it’s easy with endogenous factors. Those are numbers that just naturally fall out of the movement of markets – things like the MACD (moving average convergence-divergence indicator). Calc’ed on the fly – no biggie.
But the point we got into in Saturday’s Peoplenomics discussion was how do you handle exogenous events or quasi-data events?
We actually got a very good start in our April 1, 2015 Peoplenomics report “Should you build a Home Information Platform” because there are some obvious ways to get ahead of the news pack. It’s just a bit “processing heavy” compared to the economic returns. And we do so love to optimize when it drops money in the pockets…
But here’s another way to look at the exogenous problem: A reader offers this:
How do you put a measurement number on an exogenous factor?
Easy, I think at first blush…
“Events” drive the market IF they’re big enough to poke above the residual noise floor. Smaller events don’t matter in the short run.
So, the “size” of the “event” needs to be measured. That’s easy, too, I think.
Here are the sizes, in my humble and flawed analysis:
NAME / SIZE – WHO IS INTERESTED – TIME FRAME – EFFECT
1) Stories — Piddly routine stuff. Discussed by TeeVee talking heads and a few news junkies around the office water cooler. Effects barely noticed, and ephemeral. Time frame is a few days at most.
2) Big Stories — Stuff most people hear about — but often don’t really care very much about. Widely discussed, mostly by specialists and experts, but gone in a week or two. Quick and mild effect on “The Market(s).”
3) Really Big Stories — Stuff everybody hears about and discusses. Worthy of lots of TeeVee coverage. Letters to editors, call-in talk show grist and suchlike. Often labeled, “Controversial.” May one day be an element in a documentary film. Might make the history books as a footnote. Lifetime is a couple of months to a year or so. Often slow-onset, but high stamina material. Measurable and somewhat lasting effects on “The Market(s).”
4) Gigantic Stories — World Trade Center I (1993). Many days wall-to-wall TeeVee coverage. Laws change. Daily life procedures change, large impact on “The Market(s),” multiple month to a few years effects — but ultimately gets resolved, or fades away. A couple of paragraphs in history books. A few documentary films treat it as the central topic.
5) Stupendous, Amazing, Titanic Stories — World Trade Center II, Fukushima, JFK. Same as #4, just much more so. As with Fukushima, these often develop – an initial story seems over, but then it gets a lot stronger over some little time and becomes a S.T.A.T.-level story. Effects: decades or longer.
Large impact on Market(s) – panicked, silly selling. Bounce-back is slow.
6) Mega Stories For The Ages — Discussed by historians for decades if not centuries. Presidential assassinations, major disruptive technologies, the Return Of The Lord, — and nothing is ever the same after. Onset is usually extremely rapid, and effects are permanent. We really haven’t had many or any of these since WW2 and the coming of the atom bomb.
One strange thing: upon the inception of any of these, one knows almost instantly which level of story one is experiencing – if it’s looked at logically and with a chess player’s eye to “what must come next” as a result of what just happened.
Final very small point: These are not all bad things. Some disruptive technologies, such as the coming of the Personal Computer and robotics, are extremely positive, and foster whole new fields of endeavor, stimulate many jobs, lots of profit, lots of Public Good.
(Until they take over, subjugate us, and kill us all, anyway.)
–for what it’s worth.
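The reader’s six sizes can be bolted straight onto a number line, which is the first step toward the automated processing discussed below. A hedged sketch: the impact weights and half-lives (in days) here are invented placeholders, not calibrated values, but they capture the shape of the scale – bigger stories hit harder and decay slower.

```python
# placeholder impact weight and decay half-life (days) per story level 1-6
IMPACT = {1: 0.1, 2: 0.5, 3: 2.0, 4: 8.0, 5: 30.0, 6: 100.0}
HALF_LIFE = {1: 2, 2: 10, 3: 180, 4: 720, 5: 7300, 6: 36500}

def event_pressure(events, now):
    """Sum the decayed impact of (level, day_occurred) events at day `now`."""
    total = 0.0
    for level, day in events:
        age = now - day
        if age < 0:
            continue  # hasn't happened yet
        decay = 0.5 ** (age / HALF_LIFE[level])
        total += IMPACT[level] * decay
    return total

# a Big Story on day 0 and a Gigantic Story on day 5, measured at day 10:
# the Big Story has already lost half its punch; the Gigantic one barely any
events = [(2, 0), (4, 5)]
print(round(event_pressure(events, 10), 3))
```

The open question – and it’s the whole ballgame – is assigning the level in real time, which is where the “chess-player’s eye” above resists automation.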
It all sounds fine, except that the real time analysis gets to be problematic. How would you take a major news story (Second Coming, just because we’re chatting on a Sunday) and have a computer take care of all the numerical processing with no human intervention?
Problem 1: Breaking news: Odd sounds of trumpets heard. How loud and widespread in decibels would it need to be and how do we scale that?
Also, how would we process a “near miss” such as a host of heaven oboe players, or cherubim choruses on French Horns?
Problem 2: How would we code three days of darkness, which would be expected around this timeframe, and would that weight our modeling?
Problem 3: If the three days of darkness come on or after the horns (OR the woodwinds strike up), what stocks do we buy to maximize profits? Do we buy light bulb makers? But if we do that, I can’t find anything in the data going back forever as to whether markets stayed open. About the only reference is to someone being angry about moneychangers…
Do we automate that trade?
Perhaps this is why Sundays exist…to nibble on the future a bit before it gets here…which it’s bound to, one way or t’other. And maybe as soon as tomorrow.
Check back then and let’s see if I have a nose left …after keeping it on the grindstone for another 8 hours.