The basic difference is that a "transport" lets you exchange blocks of opaque data, while a "connectivity framework" exchanges data structures of known content. These sound similar, but they are actually profoundly different.
A transport can send data back and forth, but all interpretation is up to the application. This is called "technical interoperability". An analogy: two people are technically interoperable if they can speak and hear each other, even if they do not share a language. Sound gets across, but interpreting the information is up to the people.
A connectivity framework exchanges data structures, so it knows about types and field names; this is "syntactic" interoperability. It is like two people who speak the same language, or who work through an interpreter who translates between them. A connectivity framework allows applications to communicate even if they are written in different languages, talk over different networks, run on different CPUs, or use different operating systems.
The IICF also defines another level, called semantic interoperability. Two people are semantically interoperable if they not only speak the same language, but they also have similar educational backgrounds so they can collaborate on specific tasks. The IICF doesn't describe this level because semantically interoperable systems in practice address particular application verticals.
Syntactic interoperability is the key level analyzed by the IICF because it is needed to build complex industrial systems. A framework that understands data structures can provide configurable quality-of-service (QoS) policies that control data delivery rates, reliability, durability, filtering, persistence, and data liveliness. Frameworks that understand the data can also implement generic tools that use the data, interact with databases and other data-aware functions, and control program interfaces. Together, these functions enable a "data model" for the system, allowing diverse components to work together. Sophisticated implementations can even mediate some differences between data models, allowing a large distributed system to grow incrementally from parts that were not all developed or deployed together. Syntactic interoperability is the critical functionality for large software systems.
Stated more simply, connectivity frameworks are called "frameworks" because they provide an architecture. While transports can send data, frameworks provide a structure that defines how and when applications interact. Choosing a framework is often the first and most important step in building an intelligent distributed application.
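The transport-versus-framework distinction above can be sketched in a few lines of Python. This is an illustrative toy, not any particular middleware: the byte layout, the `SensorReading` type, and the JSON wire format are all assumptions made for the example. At the transport level the receiver must already know the byte layout; at the framework level the named type and its fields travel with the data, which is what makes syntactic interoperability possible.

```python
import json
import struct
from dataclasses import dataclass, asdict

# Transport level: opaque bytes. The receiver must know out-of-band
# that the payload is two big-endian 32-bit floats; the transport
# itself carries no type information ("technical interoperability").
def transport_send(temperature: float, pressure: float) -> bytes:
    return struct.pack(">ff", temperature, pressure)

def transport_receive(payload: bytes) -> tuple:
    # A receiver that assumes the wrong layout fails silently.
    return struct.unpack(">ff", payload)

# Framework level: a shared, named data type. Field names and types
# accompany the data, so independently written applications can
# interpret it ("syntactic interoperability").
@dataclass
class SensorReading:
    temperature: float
    pressure: float

def framework_send(reading: SensorReading) -> str:
    return json.dumps({"type": "SensorReading", **asdict(reading)})

def framework_receive(message: str) -> SensorReading:
    fields = json.loads(message)
    if fields.pop("type") != "SensorReading":
        raise ValueError("unexpected data type on the wire")
    return SensorReading(**fields)
```

Real connectivity frameworks go much further (QoS policies, discovery, type evolution), but the core idea is the same: the data's structure is part of the contract, not an application-level convention.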
What are the top digital transformation (DX) outcomes in the automotive industry… pay per use, pay as you drive, etc.?
"Edge is in the eye of beholders!" If you are a qualified system architecture, you should know clearly which parts are considered "edge"; otherwise, you may need to look for a real job.
Most often, each trade has formed, through its common practices, a shared "understanding" of where an "edge" is. For example, at the most recent "Edge Congress" meeting I chaired in Austin, I saw the "Data Center" community tell the Cellular Network community that they are "the edge". That view already differs from Stan's.
>Are there definitions of the different edges for Industrial?
In plain English, an edge is a transition point; e.g., the edge of a table or the edge of the ocean. In IoT there are several important edges, including:
1- The sensor, where the physical world meets cyberspace (e.g., a temperature is sensed into a digital value).
2- Where the direct link or the LAN meets the WAN, usually at the IoT gateway. The security posture on one side is very different from the other, making this an important transition point.
3- Where the cellular wireless network meets the fiber backbone at the cellular tower, which the telcos call MEC, for Multi-access Edge Computing.
In general, a tree hierarchy of compute aggregation tiers (aka multi-edges), going from the numerous to the few, could be an interesting organizing paradigm for certain use cases.
As a label, Blockchain/DLT tends to be bandied about in the same manner as IoT (a few years ago) and AI/ML (at present). Approaching the challenge from an application-requirements perspective, it becomes apparent that other technologies can often solve the underlying problem. Here are some issues to consider - https://www.more-with-mobile.com/2018/03/blockchain-and-mobile-industry.html
It is also important to understand the second-order implications of Blockchain/DLT (permissioned or not). In the case of supply chains, to pick one application domain, participants need to weigh the benefits of tracking and tracing against the exposure of competitive intelligence. For example, does a business want its supply arrangements - quantities, frequency, variance over time, etc. - to be visible to the entire marketplace, or to the permission-granting entity?
I agree with Marcellus
Software trustworthiness is only one aspect of what it takes to achieve IoT trustworthiness. Software trustworthiness is achieved when operational consumers of the software have developed a level of trust that the software-enabled functionality will behave as expected in both normal and abnormal operational circumstances. Software needs to be resilient and of high quality. Creating high-quality software requires close attention to detail and adherence to a well-defined, multi-phase process that includes specification, architecture, implementation, functional testing, security testing, and the application of software protection techniques where applicable.
IoT trustworthiness needs to address many additional aspects, such as device identity, safety, secure communication, and in-circuit functional upgradeability during operation, as well as resilience to the multitude of attacks that come with simply being connected to the internet. Given the sheer size of the IoT, solutions need to be automated, auditable, standardized, and scalable.
However, there is a more intangible aspect of IoT and software trustworthiness that is very hard to measure: organizational health. Time and profit pressures may cause organizations to act in ways that lead to poor-quality and insecure products.
End users of IoT-enabled functionality, whether corporate or private, need assurance that can only come from levels of trust engendered over a long period of time, as the products, and the organizations that create and operate them, act in a transparent, measurable, and ethical manner.