For a long time, manufacturing struggled less with a complete lack of data than with fragmented visibility into it. Large manufacturers often had some level of machine monitoring, MES, or reporting infrastructure, but in small and mid-sized shops, operational reality still lived mostly in disconnected systems, spreadsheets, whiteboards, and operators’ heads. The machine controls knew things. The ERP knew other things. The quoting spreadsheets knew something else. Very little of it was assembled into a coherent operational picture in real time.
That era is ending. Platforms like MachineMetrics, MTConnect-based monitoring, and a wave of newer floor-data systems have done something genuinely useful over the last decade: they've pulled data that used to be locked inside machine controls out into the open. Utilization. Cycle times. Downtime reasons. Real production rates versus quoted ones. An owner can now look at a dashboard and see, with reasonable accuracy, what every machine in the building actually did yesterday. That wasn't possible a decade ago in most shops, and the shops that have adopted these systems are making better operational decisions because of it.
So in one important sense, the data problem has been solved. Or at least, it's being solved.
But solving it has surfaced a different problem, one that was easy to miss while the machine data was still inaccessible. Now that the machines are talking, the question is what to do with what they're saying, and the answer turns out to depend on a lot of other information that was never inside the machine in the first place.
A utilization number on a dashboard is just a number. To turn it into a decision, you need to know what was running, for which customer, at what quoted rate, against what setup time, with what historical performance on similar jobs. The machine knows it ran for six hours. The ERP knows it was job 4471. The quoting spreadsheet from 2023 knows what it was supposed to take. The lead knows the fixture has been giving people trouble. The customer file knows this buyer always pushes back on revised lead times.
Five separate sources, one decision. None of them, on their own, can make it. The decision requires all of them, assembled together, at the moment somebody is trying to act.
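As a minimal sketch of what that assembly step involves, the following Python pulls the five sources into one decision context. Every record, field name, and note here is hypothetical, invented purely to illustrate the shape of the problem:

```python
# Hypothetical records from each of the five sources; the field names
# and values are illustrative, not from any real system.
machine_data = {"machine": "M4", "run_hours": 6.0, "job": "4471"}
erp_data = {"job": "4471", "customer": "Acme", "due": "2024-06-14"}
quote_data = {"job": "4471", "quoted_hours": 4.5}
lead_notes = ["Fixture on M4 has been slipping; needs repair."]
customer_file = {"Acme": "Pushes back hard on revised lead times."}

def assemble_context(job_id: str) -> dict:
    """Pull the slices relevant to one job into a single view."""
    ctx: dict = {"job": job_id}
    if machine_data["job"] == job_id:
        ctx["actual_hours"] = machine_data["run_hours"]
    if erp_data["job"] == job_id:
        ctx["customer"] = erp_data["customer"]
        ctx["due"] = erp_data["due"]
    if quote_data["job"] == job_id:
        ctx["quoted_hours"] = quote_data["quoted_hours"]
        # How far the actual run diverged from what was quoted.
        ctx["overrun_hours"] = ctx.get("actual_hours", 0.0) - quote_data["quoted_hours"]
    ctx["floor_notes"] = lead_notes
    ctx["customer_notes"] = customer_file.get(ctx.get("customer", ""), "")
    return ctx

context = assemble_context("4471")
```

Even in this toy form, the point is visible: no single source contains the overrun, the fixture issue, and the customer's temperament. The decision-ready view only exists after the join.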
This is the actual problem facing manufacturing right now, and it's different from the one the industry has been talking about. The data problem was about visibility. The context problem is about assembly. And the context problem has gotten harder, not easier, as more data has come online.
The list of relevant context sources for a typical operational decision is longer than people usually think. The newly available machine data: utilization, cycle times, downtime, real production rates. The data that's always been in the ERP: jobs, schedules, material, costs, customer history. The data in quoting tools, prints, shipping systems, email threads, and customer portals. And the data that still lives in people's heads: which fixtures need repair, which materials are running low, which customers require more care.
Most shops now have access to most of these. None of them, individually, is sufficient because every real decision draws on several at once. A shop owner deciding whether to take a rush job needs the schedule, the current state of the floor, the material situation, the customer's history, and the lead's read on what's realistic. Today, assembling that takes time, too much time. The customer has already called two other shops, one of them with plenty of capacity.
The data is there. What hasn't kept pace is the layer that pulls it together at the moment a decision is being made.
It's worth pausing on what this shift actually represents, because it's bigger than better dashboards.
Traditional manufacturing software is, almost without exception, a system of record. The ERP records jobs, inventory, and transactions. The machine monitoring system records utilization and downtime. The quoting tool records estimates. Each is excellent at being authoritative about its slice of reality. None is in the business of reasoning across that reality.
What's beginning to emerge is a different category. Call it Shop Intelligence. Not a replacement for the systems of record, and not another dashboard layered on top of them. A reasoning layer that sits across the tools a shop already depends on, assembles context from all of them, and helps a human act on it. The defining shift, compared to traditional BI, is that Shop Intelligence isn't something you go look at. It's something you ask questions of.
The difference shows up in the kinds of questions the system can usefully answer.
A system of record can tell you that machine 4 ran for six hours yesterday. A Shop Intelligence layer can tell you that machine 4's utilization dropped 18% over the last three weeks on jobs in this material family, that two of the affected jobs are for a customer with a history of late-delivery complaints, and that the lead flagged a fixture issue in his notes last Tuesday that nobody else has seen yet.
A system of record can tell you what's on the schedule for Friday. A Shop Intelligence layer can tell you that taking the rush job will most likely push two existing jobs past their committed dates, that one of those customers tolerates a one-day slip and the other doesn't, and that the bottleneck isn't the machine the owner is worried about but the inspection step downstream of it.
These aren't autonomous decisions. The human still decides whether to take the job, whether to accept the slip, whether to adjust the quote. What changes is that the human is deciding with the reasoning already done — the relevant patterns surfaced, the risks flagged, the history pulled forward. The judgment stays with the person. The grinding work of assembling and interpreting context shifts to the system.
There's another dimension that's easy to miss: the same underlying operational data is useful in different ways to different people, at different moments.
An executive wants trend visibility and emerging risk. Are margins drifting? Which customers are quietly becoming unprofitable?
An estimator wants historical context for the job in front of them. What did similar jobs actually cost? Where did the estimate go wrong last time?
A scheduler wants real-time floor coordination. What's actually running versus what's nominally scheduled? Which jobs are at risk of slipping?
An operator wants job-specific situational awareness. What did the last person who ran this part learn the hard way?
A customer-facing person wants consolidated order visibility. Where is my customer's job? When is it shipping?
Same systems, same data, five different operational context windows. A traditional ERP, by design, gives everyone roughly the same view. A Shop Intelligence layer can give each role the slice that's useful at their moment of decision, without anyone having to know which underlying system to query.
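One way to picture role-based slicing is a single shared context with a per-role key map. This is a sketch under invented assumptions; the context fields and role names are hypothetical:

```python
# Hypothetical shared context for one job; all keys and values invented
# for illustration.
shop_context = {
    "job": "4471",
    "machine": "M4",
    "quoted_hours": 4.5,
    "actual_hours": 6.0,
    "ship_date": "2024-06-14",
    "margin_trend": "-3% over 90 days",
    "floor_notes": ["Fixture on M4 needs repair."],
}

# Which slice of the same data each role actually needs at their
# moment of decision.
ROLE_KEYS = {
    "executive": ["margin_trend"],
    "estimator": ["quoted_hours", "actual_hours"],
    "scheduler": ["machine", "ship_date", "floor_notes"],
    "operator": ["machine", "floor_notes"],
    "customer_facing": ["job", "ship_date"],
}

def view_for(role: str) -> dict:
    """Return only the context keys that matter to this role."""
    return {k: shop_context[k] for k in ROLE_KEYS[role]}
```

The data is stored once; the executive's view and the operator's view are just different projections of it, which is exactly the property a single-view ERP lacks.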
This was a hard problem to solve until recently. Connecting the systems took custom integration work, brittle data pipelines, and the kind of IT lift that broke every time one of the underlying systems pushed an update. Most shops looked at the cost and walked away.
What's changed is that the integration problem has gotten meaningfully easier. Language models are unusually good at reading the messy, semi-structured output of systems that weren't designed to talk to each other and producing something coherent on the other side. An ERP export, a stream of machine telemetry, a folder of emails, a quoting spreadsheet, and a set of operator notes can be combined into a usable operational view without anyone first having to normalize them into a single schema. That's a different posture toward the data, and it's the posture a Shop Intelligence layer has to take, because the shop is never going to consolidate everything into one place, and shouldn't have to.
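That "no single schema" posture can be sketched concretely: instead of mapping every source into one normalized table, tag each source's raw output and hand the bundle to a language model along with the question. Everything below is hypothetical, including the source contents, and the model call itself is omitted:

```python
# Hypothetical raw excerpts from five differently-shaped sources.
# Nothing here is normalized into a common schema; each keeps its
# native, messy form.
sources = {
    "erp_export": "job=4471;cust=Acme;due=2024-06-14",
    "machine_telemetry": "M4 spindle 14:02-20:05, 3 faults",
    "email_thread": "Buyer asked twice about the revised date.",
    "quote_sheet": "4471: 4.5 hr est, alum 6061",
    "operator_notes": "Fixture slipping on second op.",
}

def build_context_block(question: str) -> str:
    """Bundle the raw sources, labeled by origin, into one prompt context."""
    parts = [f"--- {name} ---\n{raw}" for name, raw in sources.items()]
    return "\n".join(parts) + f"\n\nQuestion: {question}"

prompt = build_context_block("Is job 4471 at risk of slipping?")
# The assembled prompt would then go to a language model, which reads
# across the tagged excerpts; that call is deliberately left out here.
```

The design choice worth noticing is that the integration work shifts from schema mapping (brittle, per-source, breaks on every upstream update) to context assembly (cheap, additive, tolerant of format drift).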
The shops that figure this out will look different from the ones that don't. Not because they'll have better data. Increasingly, everyone will have better data. They'll look different because they'll be able to make better use of the data they have. Risks will get flagged before they become fires. Quotes will reflect what jobs actually cost. Schedules will be more accurate, because the real production rates from the floor will be flowing into the planning, and the system will be quietly pointing out which commitments are starting to look unrealistic.
The old narrative was that manufacturing had a data problem. For a long time, that was true. Then it became a data abundance problem. Too much information, in too many systems, with no connective tissue. The next chapter is Shop Intelligence: not autonomous shops, not AI managers, just a layer that turns all the data the industry has worked so hard to surface into the kind of contextual, prioritized, actionable intelligence that good operators have always assembled in their heads. The shops that build that layer first will quietly become the shops that everyone else is trying to catch up to.