- Scientific Computing and Data Management
- Distributed and Parallel Computing Systems
- Semantic Web and Ontologies
- Research Data Management Practices
- AI-based Problem Solving and Planning
- Data Quality and Management
- Logic, Reasoning, and Knowledge
- Biomedical Text Mining and Ontologies
- Advanced Data Storage Technologies
- Multi-Agent Systems and Negotiation
- Business Process Modeling and Analysis
- Geological Modeling and Analysis
- Topic Modeling
- Big Data and Business Intelligence
- Explainable Artificial Intelligence (XAI)
- Reservoir Engineering and Simulation Methods
- Personal Information Management and User Behavior
- Intelligent Tutoring Systems and Adaptive Learning
- Wikis in Education and Collaboration
- Natural Language Processing Techniques
- Advanced Database Systems and Queries
- Cell Image Analysis Techniques
- Service-Oriented Architecture and Web Services
- Bioinformatics and Genomic Networks
- Cloud Computing and Resource Management
University of Southern California
2016-2025
Viterbo University
2024
Southern States University
2013-2023
Conference Board
2023
Southern California University for Professional Studies
2013-2022
Association for the Advancement of Artificial Intelligence
2019
Association for Computing Machinery
2017-2019
Virginia Commonwealth University
2018
QIMR Berghofer Medical Research Institute
2018
Marina Del Rey Hospital
2001-2016
This paper describes the Pegasus framework, which can be used to map complex scientific workflows onto distributed resources. Pegasus enables users to represent workflows at an abstract level without needing to worry about the particulars of the target execution systems. We discuss the general issues in mapping applications and the functionality of Pegasus. We present results on improving application performance through workflow restructuring, which clusters multiple tasks into single entities. A real-life astronomy application is used as the basis for the study.
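The clustering idea in the abstract above can be sketched roughly as follows (a minimal illustration only; the function and task names are hypothetical and this is not the Pegasus API):

```python
# Minimal sketch of level-based task clustering in a workflow DAG.
# Independent tasks at the same depth are grouped so that each cluster
# can be submitted as a single job, reducing per-task scheduling overhead.

from collections import defaultdict

def cluster_by_level(tasks, deps, cluster_size):
    """tasks: list of task ids; deps: dict mapping task -> set of parent tasks."""
    depth = {}

    def get_depth(t):
        # Depth = longest path from a root task (roots have depth 0).
        if t not in depth:
            parents = deps.get(t, set())
            depth[t] = 1 + max((get_depth(p) for p in parents), default=-1)
        return depth[t]

    levels = defaultdict(list)
    for t in tasks:
        levels[get_depth(t)].append(t)

    # Split each level into fixed-size clusters.
    clusters = []
    for lvl in sorted(levels):
        same = levels[lvl]
        clusters.extend(same[i:i + cluster_size]
                        for i in range(0, len(same), cluster_size))
    return clusters

# Example: a diamond workflow with four parallel middle tasks.
deps = {"m1": {"root"}, "m2": {"root"}, "m3": {"root"}, "m4": {"root"},
        "final": {"m1", "m2", "m3", "m4"}}
print(cluster_by_level(["root", "m1", "m2", "m3", "m4", "final"], deps, 2))
# → [['root'], ['m1', 'm2'], ['m3', 'm4'], ['final']]
```

With a cluster size of 2, the four independent middle tasks become two jobs instead of four, which is the kind of restructuring the paper evaluates.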
Workflows have emerged as a paradigm for representing and managing complex distributed computations and are used to accelerate the pace of scientific progress. A recent National Science Foundation workshop brought together domain, computer, and social scientists to discuss the requirements of future applications and the challenges they present to current workflow technologies.
Grid applications require allocating a large number of heterogeneous tasks to distributed resources. A good allocation is critical for efficient execution. However, many existing grid toolkits use matchmaking strategies that do not consider the overall efficiency of the set of tasks to be run. We identify two families of resource allocation algorithms: task-based algorithms, which greedily allocate tasks to resources, and workflow-based algorithms, which search for an allocation for an entire workflow. We compare the behavior of these algorithms using simulations of workflows drawn from real...
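To make the task-based/workflow-based distinction concrete, a toy version of the greedy task-based strategy might look like the sketch below (simplified assumptions: fixed task order, runtimes known in advance, no data dependencies; this is not any actual toolkit's API):

```python
# Greedy task-based allocation: each task goes to the resource that would
# finish it earliest, without considering the rest of the workflow.

def greedy_allocate(task_runtimes, resources):
    """task_runtimes: dict task -> {resource: runtime}; resources: list of names."""
    finish_time = {r: 0.0 for r in resources}  # when each resource frees up
    schedule = {}
    for task, runtimes in task_runtimes.items():
        # Pick the resource with the earliest completion time for this task.
        best = min(resources, key=lambda r: finish_time[r] + runtimes[r])
        schedule[task] = best
        finish_time[best] += runtimes[best]
    return schedule, max(finish_time.values())  # assignment and makespan

tasks = {"t1": {"fast": 2, "slow": 6},
         "t2": {"fast": 2, "slow": 6},
         "t3": {"fast": 2, "slow": 3}}
schedule, makespan = greedy_allocate(tasks, ["fast", "slow"])
# schedule == {'t1': 'fast', 't2': 'fast', 't3': 'slow'}; makespan == 4.0
```

A workflow-based algorithm would instead search over assignments for the whole task set at once, which can beat the greedy choice when per-task optima conflict.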
Data, code, and workflows should be available and cited
Cognitive neuroscience aims to map mental processes onto brain function, which begs the question of what "mental processes" exist and how they relate to the tasks that are used to manipulate and measure them. This topic has been addressed informally in prior work, but we propose that cumulative progress in cognitive neuroscience requires a more systematic approach to representing the mental entities being mapped to brain function and the tasks used to manipulate and measure mental processes. We describe a new open collaborative project that aims to provide a knowledge base for cognitive neuroscience, called the Cognitive Atlas (accessible...
Geoscientists now live in a world rich with digital data and methods, and their computational research cannot be fully captured in traditional publications. The Geoscience Paper of the Future (GPF) presents an approach to fully document, share, and cite all research products, including data, software, and provenance. This article proposes best practices for GPF authors to make their data, software, and methods openly accessible, citable, and well documented. Publishing these digital objects empowers scientists to manage their research products as valuable scientific assets in an open...
Artificial intelligence has progressed to the point where multiple cognitive capabilities are being integrated into computational architectures, such as SOAR, PRODIGY, THEO, and ICARUS. This paper reports on the PRODIGY architecture, describing its planning and problem solving and touching upon its learning methods. Learning in PRODIGY occurs at all decision points, and integration is at the knowledge level: the reasoning modules produce mutually interpretable knowledge structures. Issues of architectural design are discussed, providing a context...
Describes the Wings intelligent workflow system, which assists scientists with designing computational experiments by automatically tracking constraints and ruling out invalid designs, letting them focus on their goals.
Many human activities are a bottleneck to progress
Computational workflows describe the complex multi-step methods that are used for data collection, data preparation, analytics, predictive modelling, and simulation that lead to new data products. They can inherently contribute to the FAIR data principles: by processing data according to established metadata; by creating metadata themselves during the processing of data; and by tracking and recording data provenance. These properties aid data quality assessment and contribute to secondary data usage. Moreover, workflows are digital objects in their own right. This paper argues that FAIR principles for workflows need to address...
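The provenance-tracking property mentioned above can be illustrated with a minimal sketch (the `run_step` helper is hypothetical, not any particular workflow system's API): each step records what it consumed, what it produced, and when, so the metadata travels with the data product.

```python
# Minimal sketch: a workflow step that records provenance as it runs.
# Digests of the inputs and output link the data product back to its origin.

import hashlib
import json
import time

def run_step(name, func, inputs):
    """Run a workflow step and return (output, provenance record)."""
    started = time.time()
    output = func(inputs)
    record = {
        "step": name,
        "inputs_digest": hashlib.sha256(
            json.dumps(inputs, sort_keys=True).encode()).hexdigest(),
        "output_digest": hashlib.sha256(
            json.dumps(output, sort_keys=True).encode()).hexdigest(),
        "started": started,
        "duration_s": time.time() - started,
    }
    return output, record

clean, prov = run_step("normalize",
                       lambda xs: [x / max(xs) for x in xs],
                       [2.0, 4.0, 8.0])
# clean == [0.25, 0.5, 1.0]; prov ties the output digest back to its input digest
```

In a real system the records would follow a standard model such as W3C PROV, but even this toy version shows how metadata creation falls out of execution for free.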
How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition, the reader will already know that the answer is: with difficulty or not at all. In this paper we attempt to quantify this by reproducing a previously published paper for different classes of users (ranging from those with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required for each of the steps in the method described in the original paper and by making them part of an...
A research agenda for intelligent systems that will result in fundamental new capabilities for understanding the Earth system.
Artificial intelligence, like any science, must rely on reproducible experiments to validate results. Our objective is to give practical and pragmatic recommendations for how to document AI research so that the results are reproducible. Our analysis of the literature shows that publications currently fall short of providing enough documentation to facilitate reproducibility. The suggested best practices are based on a framework for reproducibility given by scientific organizations, scholars, and publishers. We have made a checklist of our...
A key challenge for grid computing is creating large-scale, end-to-end scientific applications that draw from pools of specialized components to derive elaborate new results. We developed Pegasus, an AI planning system that, integrated into the grid environment, takes a user's highly specified desired results, generates valid workflows that take into account available resources, and submits the workflows for execution on the grid. We also begin to extend it into a more distributed, knowledge-rich architecture.