A brand-new graph structure, based on millions of pre-modeled terms, concepts and cognitive relationships, dramatically reduces training time and delivers capabilities that are not possible with traditional networks.
Axél’s knowledge base is built on an extensive set of nodes, some simple and some complex, which implement concepts and semantic connections. Unlike traditional solutions, Axél uses these relationships to drive specific inferences, thanks to a proprietary algorithm based on a highly innovative neural network.
Standard networks are very powerful and can handle a wide range of tasks. At the same time, however, they present some significant problems:
- Modeling a neural network (that is, designing its structure) is extremely complicated, and it amounts to an NP-complete decision problem;
- Once the network has been modeled, the system remains completely rigid: no change can be applied to its nodes without repeating the entire training process and facing the issues mentioned above.
Unlike these solutions, the algorithm developed by axélero produces a graph of millions of nodes and relationships that draws inspiration from quantum theory: each node of the graph acts like a qubit, meaning it can take different values at the same time. This enables exclusive capabilities, such as:
- Managing verb tense;
- Managing inflected forms;
- Examining customers’ behavior and tendencies.
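The first two capabilities can be pictured with a minimal sketch: one node that, like the qubit analogy above, holds several surface values (tenses and inflected forms) at the same time. This is a hypothetical illustration, not axélero’s actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class ConceptNode:
    """A node holding several surface values (inflected forms,
    verb tenses) simultaneously, all resolving to one concept."""
    concept: str
    values: set = field(default_factory=set)

    def matches(self, token: str) -> bool:
        # A single node recognizes any of its inflected forms.
        return token.lower() in self.values

# Hypothetical example: one node covers every form of "to buy".
buy = ConceptNode("BUY", {"buy", "buys", "bought", "buying"})
assert buy.matches("bought")   # past tense resolves to the same concept
assert buy.matches("Buys")     # inflected form, case-insensitive
```

In a traditional network, each surface form would typically need its own training examples; here the variants are simply listed on the node.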
NATURAL LANGUAGE PROCESSING AND UNDERSTANDING (NLP/NLU)
When dealing with natural language processing and understanding (NLP/NLU), the system created by axélero does not need to modify the size of the neural network or its connections.
If, for instance, in a legal context, we wanted to find all the judgments relevant to the question “May I buy an iguana?”, the deductive process would be:
1. Parsing of the subgraph that contains the concepts included in the sentence;
2. Activation of the nodes matching the input data;
3. Propagation of the activation signals to the subsequent nodes;
4. Detection, among all the available inferences, of the right one that yields the result of the query.
The construction of this subgraph involves:
– Definition of context;
– Grammar analysis;
– Syntax analysis;
– Semantic analysis;
– Development of the cognitive graph.
(N.B.: The subgraph shown in the above example is not a simple semantic map, but a real neural network.)
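The four deductive steps above can be sketched as a simple spreading-activation pass over a toy subgraph. The concept names and edges below are hypothetical illustrations, not axélero’s proprietary algorithm or data:

```python
# Toy cognitive subgraph: concept -> concepts it can activate.
# (Hypothetical data; the real graph holds millions of nodes.)
edges = {
    "BUY": ["PURCHASE_CONTRACT"],
    "IGUANA": ["PROTECTED_SPECIES"],
    "PURCHASE_CONTRACT": ["SALE_OF_ANIMALS"],
    "PROTECTED_SPECIES": ["CITES_REGULATION"],
}

def infer(query_concepts, edges):
    """Steps 2-3: activate the matching nodes, then propagate
    the signal to subsequent nodes until no new node fires."""
    activated = set(query_concepts)          # step 2: activation
    frontier = list(activated)
    while frontier:                          # step 3: propagation
        node = frontier.pop()
        for nxt in edges.get(node, []):
            if nxt not in activated:
                activated.add(nxt)
                frontier.append(nxt)
    return activated

# Step 1 (parsing "May I buy an iguana?") would yield BUY and IGUANA;
# step 4 would then select the inference that answers the query.
result = infer(["BUY", "IGUANA"], edges)
print(sorted(result))
```

Note that the traversal only touches the subgraph reachable from the query concepts, which is why the network as a whole never needs resizing.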
The combined action of databases and algorithms leads to several major advantages:
● A significant reduction in training time since, being concept-based:
– the system requires neither a massive quantity of examples nor a huge amount of test cases;
– the connections between concepts spread through all the possible concepts and inferences, making the entire training process smoother and quicker.
● A higher level of precision and accuracy compared to standard solutions.
● A dynamic structure whose nodes, concepts and inferences can be added or removed in real time without affecting the whole system, whereas traditional networks need to be retrained from scratch each time their database changes.
● Proprietary tools that can easily expand the knowledge base by extracting new information from different types of documents and data sources.
● As far as image scanning is concerned, a clear distinction between the analysis of image contents and the process that actually leads to an inference, so that Axél gets the best performance from both features. The best results convolutional neural networks (CNNs) can achieve are limited to scanning images, whereas axélero’s solution is able to return real inferences independently. For this reason, adding, modifying or deleting a concept requires just one single action, with no need to repeat the whole training process, i.e. to spend several hours on software development with a consequent impact on overall performance.
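The dynamic-structure advantage can be pictured with a minimal sketch (hypothetical class and method names, not axélero’s actual API) of adding and removing a concept at run time, with no retraining step:

```python
class CognitiveGraph:
    """Toy dynamic graph: concepts and inferences are edited live."""

    def __init__(self):
        self.nodes = set()
        self.edges = {}   # concept -> set of concepts it implies

    def add_concept(self, concept, implies=()):
        # Adding a node is a single local operation; the rest of
        # the graph (and anything already trained) is untouched.
        self.nodes.add(concept)
        self.edges.setdefault(concept, set()).update(implies)
        self.nodes.update(implies)

    def remove_concept(self, concept):
        # Removing is equally local: drop the node and its edges.
        self.nodes.discard(concept)
        self.edges.pop(concept, None)
        for targets in self.edges.values():
            targets.discard(concept)

g = CognitiveGraph()
g.add_concept("IGUANA", implies={"REPTILE"})
g.add_concept("REPTILE", implies={"ANIMAL"})
g.remove_concept("REPTILE")   # one action, no retraining pass
```

Each edit touches only the node involved and its incident edges, which is what lets the change go live without a full training cycle.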