learned to glide over slowly at midnight in the desert. Thus, we also need to introduce three more FMFs: (1) a control-brake FMF, generated by M&S of Newton's second law of inertia, the Langevin friction coefficients, and the mechanical hydraulic braking law; (2) a collision situation-awareness FMF, as the integration of all-weather radar, lidar, and optical flow; and (3) a Global Positioning System (GPS) FMF, which incorporates location at 100-foot resolution and time in minutes. To help IDAV decision making, we must employ all possible occurrences that cannot be normalized as a closed-set probability and are therefore treated as the open-set possibility of L. Zadeh's FMFs, together with the GPS (at 100-foot resolution) FMF and the Cloud Big Databases in the trinity "Data, Product, User," in positive enhancement loops. Let the machine statistically generate all possible FMFs with different gliding distances in triangular shape (with a mean and a variance). Each is associated with a different brake-stopping FMF distance, so that the 1000 cars generate a statistical sensor-awareness FMF. These FMFs are then adopted by a single AV from its peers' experience, and their Boolean-logic union and intersection support the final decision-making system. The average behavior mimics the wide-sense irreversible "older and wiser" Experience-Based Expert System (EBES). Moreover, the computer modeling and simulation shall consider an advanced MPD computing architecture (e.g., miniaturized graphics processors, 8 × 8 × 8 units in a backplane, by Nvidia, Inc.), which must match the MPD coding algorithm, for example Python TensorFlow, like well-fit "gloves with hands."
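As a minimal sketch of the idea (the units, fleet distribution, and function names below are illustrative assumptions, not taken from the text), the triangular braking-distance FMFs of a simulated fleet and their Zadeh union, intersection, and EBES-style average could be generated as follows:

```python
import numpy as np

def triangular_fmf(x, a, b, c):
    """Triangular fuzzy membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 60.0, 601)      # stopping-distance axis (illustrative units)

# Hypothetical fleet of 1000 cars: each contributes a triangular FMF whose peak
# is drawn from a distribution (the "mean and variance" mentioned in the text).
peaks = rng.normal(loc=30.0, scale=4.0, size=1000)
fleet_fmfs = np.stack([triangular_fmf(x, p - 8.0, p, p + 8.0) for p in peaks])

# Zadeh union (max) and intersection (min) over the peers' experience, and the
# statistical average that mimics the "older and wiser" EBES behavior.
union_fmf = fleet_fmfs.max(axis=0)
intersection_fmf = fleet_fmfs.min(axis=0)
average_fmf = fleet_fmfs.mean(axis=0)

print(f"peak of averaged FMF at ~{x[average_fmf.argmax()]:.1f} distance units")
```

A single AV could then adopt the averaged or intersected FMF as its peer-derived brake-stopping rule; this is only a sketch of the statistical generation step, not the book's full decision-making pipeline.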
We recapitulate a few important concepts as follows:
1. Analyticity: there is an analytic cost-energy function of the landscape, and the set of N initial and boundary conditions must correspond one-to-one to the final set of N gradient results (see the sketch after this list).
2. Causality: an ANN proceeds from the initial labeled or unlabeled boundary conditions to reach a definite local minimum.
3. Deep Learning (DL) adapts, from Big Data, the connection weight matrix [W_j,i] between the j-th and i-th processor elements (about millions per layer) across multiple layers (about 10-100). We also consider unlabeled-data deep learning (UDL), which is based on a BNN of both neurons and glial cells; the experience-based expert system can increase trustworthiness, sophistication, and DARPA-style explainable AI (XAI).
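The analyticity and causality points can be illustrated with a small deterministic gradient-descent sketch (the cost landscape, learning rate, and N = 13 initial conditions below are assumptions chosen only for illustration):

```python
import numpy as np

def cost(w):
    """Illustrative analytic cost-energy landscape with several local minima."""
    return 0.05 * w**2 + np.cos(3.0 * w)

def grad(w):
    """Analytic gradient of the cost above."""
    return 0.1 * w - 3.0 * np.sin(3.0 * w)

def descend(w0, lr=0.01, steps=2000):
    """Deterministic gradient descent: each initial condition w0 maps to one minimum."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

initial_conditions = np.linspace(-6.0, 6.0, 13)   # the set of N initial conditions
for w0 in initial_conditions:
    wf = descend(w0)
    print(f"w0 = {w0:+5.2f}  ->  local minimum at {wf:+5.2f}, cost {cost(wf):.3f}")
```

Because the descent is deterministic, each initial condition lands in a definite local minimum, which is the one-to-one correspondence and causality described above.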
We will introduce the Lotfi Zadeh FMF for open-set occurrences in the aforementioned positive feedback loop; such occurrences cannot be normalized as a probability and instead form what is called the possibility space. This concept is important because "young and beautiful" is a much sharper possibility than either "the young" or "the beautiful." When we average over spatial cases, we obtain the average of the EBES in order to elucidate modern AI. We consider a driverless car equipped with a sensor suite (e.g., collision avoidance with all-weather W-band radar or optical LIDAR and video imaging), scaled up to 1000 identical driverless cars in the scenario of stopping at a red light.
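As a hedged sketch of why the compound concept is sharper (the membership shapes and thresholds below are invented purely for illustration), the Zadeh min-intersection of two possibility distributions has narrower support than either component:

```python
import numpy as np

ages = np.linspace(0.0, 80.0, 81)

# Hypothetical possibility distributions; unlike probabilities, they need not sum to 1.
young = np.clip((45.0 - ages) / 30.0, 0.0, 1.0)              # possibility of "young"
beautiful = np.exp(-((ages - 28.0) ** 2) / (2.0 * 12.0**2))  # possibility of "beautiful"

# Zadeh intersection (min): "young AND beautiful" is sharper than either part alone.
young_and_beautiful = np.minimum(young, beautiful)

for name, fmf in [("young", young), ("beautiful", beautiful),
                  ("young AND beautiful", young_and_beautiful)]:
    print(f"support of '{name}': {(fmf > 0.1).sum()} age bins")
```

The intersection's support is contained in both components' supports, which is the sense in which the compound possibility is "much sharper."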
Lastly, after the tightly coupled MPD architecture and algorithm, like a hand wearing a glove, we assume large shared Big Databases in the Cloud that allow us to