Charles River Analytics Inc., developer of intelligent systems solutions, proudly announces that Dr. Avi Pfeffer, our Chief Scientist, will present a new type of probabilistic cognitive architecture during an invited talk at a Declarative Learning-Based Programming workshop. The workshop is held in conjunction with the Association for the Advancement of Artificial Intelligence’s Thirty-Second Conference on Artificial Intelligence (AAAI-18), which takes place in New Orleans, LA, from February 2 to 7, 2018.
Scruff™: A Deep Probabilistic Cognitive Architecture
Probabilistic programming is a modeling tool that borrows lessons from programming languages and applies them to the problems of designing and using models of uncertainty. Scientists already construct such models, but doing so by hand is difficult and cumbersome. Probabilistic programming simplifies the process by providing an easy-to-use language for representing models, along with reasoning algorithms to run them.
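To make the idea concrete, here is a minimal, language-neutral sketch (in Python, not Scruff or Figaro) of what a probabilistic program does: the `model` function below is a hypothetical generative model of a noisy diagnostic test, and inference conditions on observed evidence. The numbers and names are illustrative assumptions, not taken from Scruff.

```python
import random

random.seed(0)

def model():
    """A tiny generative model: a rare condition and a noisy test for it.
    Assumed numbers: P(condition) = 0.01; the test fires with probability
    0.95 when the condition holds and 0.10 when it does not."""
    condition = random.random() < 0.01
    if condition:
        positive = random.random() < 0.95
    else:
        positive = random.random() < 0.10
    return condition, positive

def infer_posterior(num_samples=200_000):
    """Estimate P(condition | positive test) by rejection sampling:
    run the model many times and keep only runs matching the evidence."""
    accepted = 0
    hits = 0
    for _ in range(num_samples):
        condition, positive = model()
        if positive:            # condition on the observed evidence
            accepted += 1
            hits += condition
    return hits / accepted

print(infer_posterior())
```

A probabilistic programming language packages exactly this pattern: the user writes only the model, and the system supplies inference algorithms (sampling, exact, variational) that answer queries against it.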
In recent years, deep neural networks have had significant success in many applications, such as computer vision, speech recognition, and natural language understanding. One reason for this success is their ability to discover hidden features of the domain through complex, multi-layered, nonlinear functions; another is their ability to learn and reason about these functions in a scalable way. Given big data, deep neural networks are tremendously effective. However, they usually require a large amount of data to learn; when data is sparse, prior knowledge is needed, and prior knowledge is exactly what probabilistic programs encode well.
Dr. Pfeffer hypothesizes that general cognition and learning can be modeled by fusing numerous deep learning functions (creating “scruffiness”) within a general, coherent cognitive architecture—a “neat” probabilistic program. Scruff™ provides the ability to learn hidden features of the domain, similarly to a neural network, using a probabilistic program. The Scruff architecture ensures that any model built makes sense, yielding explainable behaviors and efficient reasoning mechanisms.
“Scruff uses Haskell’s rich type system to create a library of models, where each kind of model can support certain kinds of inference efficiently,” said Dr. Pfeffer. “Using Scruff, we can combine many kinds of deep models, such as deep noisy-or networks, deep probabilistic context-free grammars, and deep conditional linear Gaussian networks.”
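As background for one of the model families named above: a noisy-or node models an effect with several independent causes, where each active cause triggers the effect with its own probability and a "leak" term covers unmodeled causes. The sketch below (plain Python, not Scruff code; the function name and parameters are our own illustrative choices) computes that combination rule.

```python
def noisy_or(active_causes, cause_strengths, leak=0.0):
    """P(effect = true) under a noisy-or model.

    active_causes   -- indices of the causes that are currently "on"
    cause_strengths -- cause_strengths[i] is the probability that cause i,
                       acting alone, triggers the effect
    leak            -- probability the effect occurs with no modeled cause

    The effect fails to occur only if the leak and every active cause
    all independently fail to trigger it.
    """
    p_effect_absent = 1.0 - leak
    for i in active_causes:
        p_effect_absent *= 1.0 - cause_strengths[i]
    return 1.0 - p_effect_absent

# Two active causes with strengths 0.8 and 0.6, no leak:
# the effect is absent only if both fail, so P = 1 - 0.2 * 0.4 = 0.92.
print(noisy_or([0, 1], [0.8, 0.6]))
```

A "deep" noisy-or network stacks layers of such nodes, so hidden-layer causes can themselves be effects of deeper causes, which is the kind of multi-layered structure the quote refers to.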
Dr. Pfeffer has also developed an open-source, probabilistic programming language called Figaro™. He is the author of Practical Probabilistic Programming, which introduces readers to probabilistic programming and enables those without experience in machine learning to create rich, probabilistic modeling applications using Figaro.
For more information on Scruff, Figaro, or Charles River’s related projects and capabilities, contact us.
About Charles River Analytics: Since 1983, Charles River Analytics has been delivering intelligent systems that transform our customers' data into mission-relevant tools and solutions to support critical assessment and decision-making. Charles River continues to grow its technology, customer base, and strategic alliances through research and development programs for the DoD, DHS, NASA, and the Intelligence Community. We address a broad spectrum of mission areas and functional domains, including sensor and image processing, situation assessment and decision aiding, human systems integration, cyber security, human-robot interaction, and robot localization and automation. These efforts have resulted in a series of successful products that support continued growth in our core R&D contracting business, as well as the commercial sector. Charles River became an employee-owned company in 2012 to set the stage for the next generation of innovation, service, and growth.