
The 12th Annual Enterprise Computing Community Conference Goes Virtual

This year, the ECC conference was hosted on a virtual platform instead of a face-to-face meeting at Marist College, as it had been for the previous 11 conferences.

"IT Trendz" in white against a purple banner, white chat bubble in righthand corner, with dots connected by white lines against a blue background.

This is the first of two articles about the Enterprise Computing Community (ECC) conference. This year, the conference was hosted on a virtual platform instead of a face-to-face meeting at Marist College, as it had been for the previous 11 conferences. The ECC is a community created to revitalize the undergraduate curriculum in enterprise computing. This year, the themes of the conference were quantum computing, AI and machine learning. The topics were not limited to these subjects, but they were definitely the main thrust. I gave a 15-minute presentation on enterprise middleware. Other presenters also worked around the main themes.
 
There were various levels of partnership for the conference, and this year the sponsors included:
 
Gold sponsors: Marist College, IBM and IDCP
Silver sponsors: Mainline Information Systems, Broadcom, Privakey and Vicom Infinity
Bronze sponsors: IEEE Mid-Hudson Section and NewEra
Academic: University of Arkansas, Sam M. Walton College of Business, Department of Information Systems
Media: IBM Destination Z and IBM Systems Media

Let the Fun Begin

The conference started on Sunday night, June 7, with a registration welcome and virtual platform logistics session at 6 p.m. The people from Marist who ran the conference, the guidance team, gave an engaging digital introduction in less than 15 minutes. They gave us an overview of the event sponsors, ran a virtual registration desk and covered the basics of the tool used to support the conference.
 
The tool for the conference was Sakai, a free, community-source educational software platform designed to support teaching, research and collaboration. The tool was reliable and got the job done; it was easy to use and held up well under the load of hundreds of participants.
 

Content Presentations on Monday, June 8

The day began with a welcome session hosted by Roger Norton, the dean. During the video introduction, Norton passed the video content around the virtual table so other ECC community members could present, which kept it interesting. After the dean's first few minutes, Dr. Dennis Murray, president of Marist College, welcomed the participants to the 12th annual conference.

First Keynote Speaker  

At 9:30 a.m., we had our first keynote speaker, Greg Lotko, general manager and senior vice president, Mainframe Software Division at Broadcom. His presentation was titled “Opportunity Unfolding: Tracking the Adoption Journey of AI in IT Ops.”
 
In this 20-minute presentation, Lotko covered implementations of AI in IT operations in a diverse and interesting way, not limiting the examples and discussion to the IT operations domain alone. Lotko indicated that organizations have been aware of the promise of AI and machine learning for advancing operations for years, but we’re now entering a new era where the explosion of data and evolving computing models are poised to take AI from niche use cases to widespread adoption, including on the mainframe. Human and machine intelligence are coming together to make people, process and platforms more efficient, secure and productive than ever before. This opens opportunities for AIOps to detect infrastructure issues sooner, diagnose them faster and foster self-healing systems through outcome-driven automation. But what are the touchstones that will determine how far and how fast adoption of AIOps should or will occur? And what’s on the horizon? These questions were his focus.
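To make the detection idea concrete, here is a minimal sketch of the kind of statistical baseline an AIOps pipeline might start from. The metric, window size and threshold are my own illustrative choices, not anything from Lotko's talk:

```python
# Minimal sketch of AIOps-style anomaly detection on an operations metric.
# Hypothetical data and thresholds; not from the presentation.
import pandas as pd

def flag_anomalies(metric: pd.Series, window: int = 60, z_thresh: float = 3.0) -> pd.Series:
    """Flag points more than z_thresh standard deviations away from the
    rolling mean -- a classic first step toward detecting infrastructure
    issues sooner."""
    mean = metric.rolling(window).mean()
    std = metric.rolling(window).std()
    zscores = (metric - mean) / std
    return zscores.abs() > z_thresh

# Example: per-minute CPU utilization samples (invented values)
cpu = pd.Series([42.0] * 120 + [97.5])  # a sudden spike at the end
print(flag_anomalies(cpu).iloc[-1])     # True -- the spike is flagged
```

A real AIOps stack would layer diagnosis and automated remediation on top of detection like this; the sketch shows only the first step.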
 
After this first keynote, there were three 30-minute presentations sequenced one after the other. Each is discussed below.

Twitter and Batch Jobs

David Follis from IBM presented “Twitter, The NYC MTA, Batch jobs, Watson Sentiment Analysis, and Sarcasm.”
 
In this 10-minute presentation, Follis explained that it all started as an internship project to build a sample Java® Batch application. The team wanted it to be something interesting and not just a boring sample, so their intern created a batch application that consumed tweets about the New York City Metropolitan Transportation Authority (MTA) and processed them through IBM Watson™. In this session, Follis took a look at the Twitter API used to consume tweets and at a small application to capture tweets about the MTA (and why the team chose the MTA). Then he looked at the JSR-352 Java Batch programming model and how the team used it to process the tweets, creating a database to manage them. Also discussed was how they used Watson Analytics Sentiment Analysis to try to determine how happy (or angry) each tweet was. Follis wrapped up by considering how well (or poorly) the analytics recognize sarcasm.
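The project itself was a Java Batch application; as a language-neutral illustration of the chunk-oriented reader/processor/writer pattern that JSR-352 defines, here is a small Python sketch. The tweets, chunk size and keyword-based scoring stub are invented, and the real project called Watson rather than a stub:

```python
# Sketch of the JSR-352 chunk-oriented reader/processor/writer pattern,
# transliterated to Python for illustration. Data and scoring are hypothetical.
import sqlite3

def read_tweets(source):                   # ItemReader analogue
    for tweet in source:
        yield tweet

def score_sentiment(text: str) -> float:  # ItemProcessor analogue
    """Stand-in for a Watson sentiment call: -1.0 (angry) .. 1.0 (happy)."""
    return -0.8 if "delayed" in text.lower() else 0.2

def write_chunk(db, chunk):                # ItemWriter analogue
    db.executemany("INSERT INTO tweets(text, sentiment) VALUES (?, ?)", chunk)
    db.commit()                            # JSR-352 commits once per chunk

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tweets (text TEXT, sentiment REAL)")

chunk, CHUNK_SIZE = [], 10
for tweet in read_tweets(["MTA train delayed again...", "Smooth ride on the 4 today!"]):
    chunk.append((tweet, score_sentiment(tweet)))
    if len(chunk) >= CHUNK_SIZE:
        write_chunk(db, chunk); chunk = []
if chunk:
    write_chunk(db, chunk)
print(db.execute("SELECT * FROM tweets").fetchall())
```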

Quantum Computer Algorithms and Programming

Next, Casimer DeCusatis from Marist College presented “Quantum Computer Algorithms and Programming.” 
 
In this 15-minute presentation, DeCusatis shared that quantum computing holds the promise of revolutionizing our approach to data encryption, search algorithms, network and graph optimization, modeling of chemical and biological systems, and much more. Certain types of problems with exponential time complexity on conventional computer architectures can be solved much faster on quantum computers. Now that the first generation of quantum computers has been developed, there is significant interest in learning quantum computer programming in preparation for the availability of larger, more powerful machines. In this presentation, DeCusatis described a new 12-week online course in quantum computer algorithms and programming at Marist College. The course provides the unique opportunity for students to program an actual quantum computer, the 10-20 qubit IBM System Q, and is designed to be accessible to undergraduate, graduate and adult students with no previous background in quantum computing.
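The course syllabus itself wasn't shown, but to give a flavor of the kind of first-circuit exercise used when learning to program IBM's quantum systems, here is a minimal Bell-state example using the open-source Qiskit SDK (my choice of example, assuming a local `pip install qiskit`):

```python
# A minimal first quantum program: prepare an entangled Bell state.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into an equal superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # {'00': 0.5, '11': 0.5} -- perfectly correlated
```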

Privakey’s Journey to the Secure Cloud With Linux on IBM Z 

At 11:30 a.m., there was a panel titled “Privakey’s Journey to the Secure Cloud with Linux® on IBM Z® and LinuxONE® and Improving Security and Customer Experience for High Risk/High Value Digital Transactions.” The session was moderated by Len Santalucia from Vicom Infinity, with participants including Alex Kim and Vincent Terrone from Vicom Infinity, as well as Pat Kelly and Brian Ross from Privakey.  
 
In this 30-minute presentation, the panelists discussed Privakey’s experience porting to Linux on IBM Z and Hyper Protect Services on LinuxONE, and how this experience led IBM to select Privakey for the IBM Cloud® Accelerator Program and made it a new ISV now available for Linux on IBM Z. The panel also discussed the exciting new capabilities that Privakey CX brings to IBM LinuxONE and IBM Z. Privakey CX offers a next-generation utility for multifactor authentication and transaction intent verification designed to operate with IBM Secure Service Container and Hyper Protect Services. This technology secures and simplifies transaction flows and business workflows to create efficiencies and reduce attack surfaces. Financial institutions, online merchants, healthcare organizations and other service providers are turning to Privakey to deliver better customer experiences, improve their security posture and comply with new regulations.
 
After this presentation, we had a lunch and fitness break, with the ability to select from four fitness videos created by Marist College students and sponsored by Vicom Infinity and Privakey.

The Quantum Decade         

At 1 p.m., we had our second keynote speaker, Scott Crowder, chief technical officer and vice president, quantum computing, technical strategy and transformation for IBM Systems. His presentation was titled “IBM Entering the Quantum Decade.”

In this 25-minute video presentation, Crowder discussed how quantum is different from the computing in wide-scale use today. Quantum is novel: as a tool, it has a set of “easy problems to solve” that is broader than that of conventional computing. The problems to be solved by quantum are of significant value to society and business. “Quantum computers are the only novel hardware which changes the game.”

For the next hour, we had two 30-minute presentations.

Cybersecurity Operations

At 2 p.m., David Santeramo from Mainline Information Systems Inc. presented “AI in Cybersecurity Operations.”

During this 20-minute presentation, Santeramo shared that 2019 saw a significant increase in the number of cyberattacks targeting the education sector. Attacks such as adware compromises, phishing and ransomware, as well as threats from state actors targeting the university research community, have increased the demands on cybersecurity teams in the education sector.

Today’s cybersecurity operations teams are barely able to keep up with the ever-increasing threat. There are not enough fingers and eyes available to keep up with the growing demand. The typical work assignment duration of a cybersecurity analyst is 18 months, with fatigue being the most common reason for leaving. When an analyst leaves an organization, most of the accumulated knowledge typically departs with the individual. Can AI be used to help organizations retain talent and knowledge? What role can AI play in a cybersecurity operations team's daily threat analysis and response?

The presentation focused on techniques, tools and processes that institutions can use to incorporate AI into their environments, and it explained the challenges that the education sector faces today and in the future regarding cybersecurity.

Machine Learning in Healthcare Reimbursement Models

At 2:30 p.m., Juan Arias from Marist College presented “Machine Learning in Healthcare Pay-for-Performance Reimbursement Models.”

During the presentation, Arias indicated that health policy has concentrated on increasing access, containing costs and improving quality. Pay-for-performance has gained popularity as a means to improve quality of care while improving efficiency and reducing costs. One of the goals of the Affordable Care Act was to reduce U.S. healthcare spending and improve the quality of healthcare. Models have been created to reimburse providers based on meeting specific outcomes instead of paying for the services provided.

Payments should be tied to performance that can be demonstrated against the objectives established by the payer. The success or failure of this framework depends on how payers evaluate performance and structure incentives. Implementing pay-for-performance is usually difficult: achieving a valid, reliable and comprehensive measurement of performance in an area as complex as medical care is extremely challenging.

Models that consider all patients to be at the same level of risk are clinically ineffective and prohibitively expensive. To maximize efficiency and improve outcomes, health centers must analyze their patient population and customize care and interventions based on identified risks and costs. One of the most critical steps in creating a machine learning model is an adequate definition of the problem. In this case, the goal is to classify patients by their “complexity” based on their utilization of healthcare resources.

This sounds relatively simple. However, the complexity is that the patients that use the most resources are not necessarily the ones we’re interested in. Patients that suffer from terminal diseases sometimes make the most use of available resources, but there’s not much that can be expected from provider intervention.
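As a hedged sketch of this framing, the toy scikit-learn model below classifies synthetic patients as “complex” from utilization features. The features, labeling rule and model choice are mine for illustration, not from Arias' presentation:

```python
# Toy patient-complexity classifier on synthetic utilization data.
# Features, labels and model choice are hypothetical.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.poisson(2, n),       # ER visits per year
    rng.poisson(5, n),       # outpatient visits per year
    rng.integers(0, 6, n),   # number of chronic conditions
])
# Hypothetical label: "complex" patients whose utilization a provider
# intervention could plausibly change (terminal cases would be excluded
# upstream, per the point made in the talk).
y = (X[:, 0] + X[:, 2] > 5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print(f"held-out accuracy: {model.score(X_te, y_te):.2f}")
```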

I appreciated learning that machine learning techniques can help build models that reflect the risk profile of patient populations, and that these models can be used to create the right incentives for providers to address the cohort of patients that will have the largest impact on the cost/benefit equation. Finally, I learned that these models can be extended to include other factors, like the social determinants of health.

IBM/MIT/Watson Lab Research in AI

Later, we had an invited speaker, John Cohn, an IBM Fellow at the MIT-IBM Watson AI Lab. His topic was AI research at the MIT-IBM Watson AI Lab.

This was a 30-minute video presentation full of amazing facts and ideas. Cohn’s main discussion covered the evolution of AI, from narrow AI through broad AI to general AI. He also discussed a platform for the AI lifecycle, spanning compute, data and models, applications and workflow. The material had both an industry and an academia context. Cohn included many examples from the MIT-IBM Watson AI Lab, like large-scale video understanding, and machine learning and the malware arms race.

For the next hour, we had two 30-minute presentation slots, each with two presentations to choose from.

Re-Thinking DataPrep        

Ravi Chandran from XtremeData Inc. presented “Re-thinking DataPrep on the z-Architecture” and Tamer Abuelsaad from IBM presented “Designing an Intelligent Tutoring System for Early Childhood Learning Vocabulary Acquisition.” I attended the session presented by Mr. Chandran. 

During this 28-minute presentation, Chandran made the case for re-thinking DataPrep on the z-Architecture. The field of “DataPrep” has been growing rapidly, driven by two major factors: large and growing data volumes, and the de-coupling of data from legacy integrated systems into object stores and data lakes. Legacy systems, such as EDWs from Teradata, IBM and Oracle-EMC, incorporated both the data and the processing engine (SQL). This has been disaggregated with the rise of public and private clouds, where data storage is separate from the processing platform.

Object stores and data lakes are now being filled with data from disparate legacy systems as well as from newer, online data sources. This has created demand for data-preparation tools that can access, transform, cleanse and integrate these data sources. At small scales, many such tools exist in the market. Large scales require large computation capacity and therefore a re-thinking toward distributed computing for scalability.
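As a toy illustration of those access, transform, cleanse and integrate steps, here is a tiny single-process pandas sketch. The sources and column names are hypothetical, and at the scales Chandran discussed this work would run on a distributed or high-density platform rather than in one process:

```python
# Toy DataPrep pipeline: access, transform, cleanse, integrate.
# Sources and columns are invented for illustration.
import pandas as pd

# access: two disparate sources with inconsistent conventions
legacy = pd.DataFrame({"CUST_ID": [1, 2], "REV": ["1,200", "950"]})
online = pd.DataFrame({"customer_id": [2, 3], "revenue": [300.0, 410.0]})

# transform + cleanse: normalize names, types and formats
legacy = legacy.rename(columns={"CUST_ID": "customer_id", "REV": "revenue"})
legacy["revenue"] = legacy["revenue"].str.replace(",", "").astype(float)

# integrate: one aggregated view across both sources
combined = (pd.concat([legacy, online])
              .groupby("customer_id", as_index=False)["revenue"].sum())
print(combined)
```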

In this session, Chandran analyzed the requirements for DataPrep and made the case for the IBM Z architecture as a high-density, efficient hardware platform for large-scale data processing. I learned the most from the material on the landscape (like large and growing data volumes) and on the requirements (like scalability and fast ingest and export of data).

Wine Match: What’s the Wine for You?

At 4:30 p.m., there were also two choices. Cindy Rodriguez, Hemanginiben Chauhan and Deepthi Srinivasan from Marist College presented “Wine Match: What’s the Wine for You? Utilizing Machine Learning for Wine Tasting & Pairing,” while Kevin Bruckner from IBM presented “Art with IBM Watson.” I attended the wine matching presentation.

During this 16-minute presentation, Rodriguez, Chauhan and Srinivasan shared that wines are made from different varietals and bottled all around the world, and they are an integral part of the dining experience. Restaurants invest in professional wine tasters (sommeliers) who help create a database of wines and pair them with foods. The researchers from Marist have tried to create classification and predictive techniques (using SPSS) by which a machine could act as a sommelier. Models trained on various sommelier descriptions of different wines can classify the wines by assigning points.

The future plan is to embed the information in a QR code attached to wine bottles, which could then be scanned to retrieve information about the wine. The data set was mined from the 2017 edition of Wine Enthusiast magazine by Zackthoutt. From the presentation, I realized that if one encounters a new wine, the predictive model can be used to estimate its rating and act as a suggestion for anyone who needs a recommendation.
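The team built their models in SPSS; as a rough open-source analogue of the idea, here is a tiny TF-IDF plus linear-model sketch that predicts point ratings from sommelier-style descriptions. The reviews and scores below are invented for illustration:

```python
# Sketch of rating prediction from wine descriptions (scikit-learn analogue
# of the SPSS approach; training texts and scores are invented).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

reviews = [
    "bright acidity with crisp green apple and citrus notes",
    "flabby and dull, short finish with stewed fruit",
    "elegant tannins, dark cherry, long balanced finish",
    "harsh, bitter, unbalanced with a medicinal edge",
]
points = [91, 80, 94, 78]  # invented sommelier-style scores

model = make_pipeline(TfidfVectorizer(), Ridge()).fit(reviews, points)
# A new description sharing vocabulary with the well-rated wines
# should predict toward the higher end of the scale.
print(model.predict(["crisp citrus, elegant and balanced"]))
```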

The day ended with art gallery tours open from 5 to 7 p.m. During the tour, you could immerse yourself in visits to Marist College faculty members' personal art studios and private collections. The artists included Ed Smith, Julia Whitney Barnes, Donise English, Matt Frieburghaus, Marist AM FM Magazine, and Joyce Yu-Jean Lee. Donise English, professor of studio art in the School of Communication and the Arts at Marist College, coordinated the art videos. Through these videos, I learned about the artists' creative techniques and passions. This event was sponsored by IBM.

Next Post

Next week, I’ll discuss the final day of the ECC Conference. 
