Spotlight Webinar: Hewlett Packard Enterprise

Description: Research projects require collaboration and the sharing of data sets with other organizations. Although researchers have good intentions to share data, processes and regulations often prevent data sets from being shared, delaying the project start by several months while project managers and principal investigators work through privacy and regulatory issues. In response to these challenges, HPE developed HPE Swarm Learning, a decentralized, blockchain-based AI model training capability that allows multiple cohorts to keep their proprietary data sets in place, without ever needing to make copies or ship data to a centralized location.
HPE Swarm Learning lets collaborators gain the benefits of large data sets without sharing raw data or duplicating and moving it. The net result is more accurate research outcomes and a timely kickoff of research projects.
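For readers unfamiliar with the decentralized training pattern the description refers to, the sketch below illustrates the general idea in Python. It is a toy example with a linear model and made-up site data, not the HPE Swarm Learning API: each collaborator trains on its own private data, and only model parameters are exchanged and averaged, so raw data never leaves its site.

# Minimal sketch of swarm-style decentralized training (hypothetical,
# not the HPE Swarm Learning API): each site trains locally and shares
# only model parameters; a merge step averages them across sites.
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One round of local training: a gradient step on a linear model."""
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def swarm_merge(weight_sets):
    """Average the parameters contributed by all collaborators."""
    return np.mean(weight_sets, axis=0)

# Two sites with private data; only `w` is exchanged, never X or y.
rng = np.random.default_rng(0)
w = np.zeros(3)
site_data = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(2)]
for _ in range(10):
    w = swarm_merge([local_update(w, X, y) for X, y in site_data])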
Speaker: Steve Heibein, HPE AI Ambassador
