Roughly 80% of all data is estimated to be unstructured (unstructured big data). If the goal is to improve know-how with insights from domain experts, AI/NLU algorithms must become part of the next generation of big data analytics. These algorithms can gather content from different types of sources (online and offline), analyse and fuse it with deep semantic NLU techniques, and discover insights concerning, for example, specific threats to a given population within a defined time frame and a targeted geographic location.
From social media to scientific papers, from technical reports to newspapers, and more: everything becomes a useful source of information, allowing end-users to find the “information nuggets” that trigger new ideas or prompt verifications or interventions. The ability to automatically understand natural language enables new approaches to advanced analysis; even citizens can become “sensors” for their territory through the analysis of human factors with state-of-the-art behavioural algorithms such as emotion and stylometric analysis.
The target is:
•to automatically read and understand text from acquired documents (e.g. financial statements and related notes, web sites, news, internal docs…);
•to automatically analyse and classify all types of documents and add semantic tags for searching (e.g. initiatives for sustainability, Sentiment, Stakeholder, Citizenship, Governance…);
•to extract and normalize key information for the ESG domain (e.g. Certifications, Emissions, Investments, GRI indicators…);
•to calculate a Reputation index by combining emotional and behavioural scores with the severity level of possible crimes;
•finally, to provide a platform for semantic search and dashboarding to support the analysis of reputation information and the reporting of performance information.
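The tagging and scoring steps above can be sketched in miniature. The snippet below is only an illustration: the keyword lexicon stands in for the deep semantic classifier, and the reputation formula (an average of emotional and behavioural scores discounted by crime severity) is an assumed combination rule, not the project's actual algorithm.

```python
# Illustrative sketch of two pipeline steps: semantic tagging and a
# reputation index. Tag names, keywords, and the scoring formula are
# hypothetical assumptions for demonstration purposes.
from dataclasses import dataclass, field

# Toy keyword lexicon standing in for a deep-semantic document classifier.
ESG_TAGS = {
    "Sustainability": {"sustainability", "emissions", "renewable"},
    "Governance": {"board", "governance", "audit"},
    "Stakeholder": {"stakeholder", "investor", "community"},
}

@dataclass
class Document:
    text: str
    tags: list = field(default_factory=list)

def tag_document(doc: Document) -> Document:
    """Attach every semantic tag whose keywords appear in the text."""
    words = set(doc.text.lower().split())
    doc.tags = sorted(tag for tag, kws in ESG_TAGS.items() if words & kws)
    return doc

def reputation_index(emotional: float, behavioural: float,
                     crime_severity: float) -> float:
    """Combine scores in [0, 1]: the average of the emotional and
    behavioural scores, discounted by crime severity (assumed rule)."""
    base = 0.5 * emotional + 0.5 * behavioural
    return round(base * (1.0 - crime_severity), 3)

doc = tag_document(Document("The board approved new emissions targets"))
print(doc.tags)                            # ['Governance', 'Sustainability']
print(reputation_index(0.8, 0.6, 0.25))    # 0.525
```

In a real deployment the keyword sets would be replaced by trained NLU models, but the overall shape (classify and tag, then aggregate scores into an index) matches the targets listed above.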