Publications
Here you can find the complete list of my publications. You can use the tag cloud below to show only the papers dealing with specific research topics, and you can expand the Abstract, Links and BibTeX record of each paper.
2023
Augello, Agnese; Caggianese, Giuseppe; Gallo, Luigi
VITE I Conference: Contributes in the Frame of a Human Augmentation Space Journal Article
In: Journal of the Italian Astronomical Society, vol. 94, no. 1, 2023.
Abstract | BibTeX | Tags: Artificial intelligence, Augmented Reality, Enactivism, Human augmentation, Virtual Reality
@article{augelloVITEConferenceContributes2023,
title = {VITE I Conference: Contributes in the Frame of a Human Augmentation Space},
author = { Agnese Augello and Giuseppe Caggianese and Luigi Gallo},
year = {2023},
date = {2023-01-01},
journal = {Journal of the Italian Astronomical Society},
volume = {94},
number = {1},
abstract = {Our contribution is to examine some of the works presented during the VITE I conference from a perspective of Human Augmentation (HA). In the paper, we provide a definition of HA framed by Enactivism theory, also schematizing our viewpoint in a three-dimensional space and into an architecture for designing and implementing HA systems.},
keywords = {Artificial intelligence, Augmented Reality, Enactivism, Human augmentation, Virtual Reality},
pubstate = {published},
tppubtype = {article}
}
2022
Casoria, Luigi; Gallo, Luigi; Caggianese, Giuseppe
Safeguarding Face-To-Face Communication in Augmented Reality: An Adaptive Interface Proceedings Article
In: 2022 IEEE International Conference on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE), pp. 127–132, IEEE, 2022, ISBN: 978-1-66548-574-6.
Abstract | Links | BibTeX | Tags: Adaptive interface, Augmented Reality, Data visualization, Mobile computing, Patient monitoring, Ubiquitous computing, User interface
@inproceedings{casoriaSafeguardingFaceToFaceCommunication2022,
title = {Safeguarding Face-To-Face Communication in Augmented Reality: An Adaptive Interface},
author = { Luigi Casoria and Luigi Gallo and Giuseppe Caggianese},
doi = {10.1109/MetroXRAINE54828.2022.9967661},
isbn = {978-1-66548-574-6},
year = {2022},
date = {2022-10-01},
urldate = {2023-03-15},
booktitle = {2022 IEEE International Conference on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE)},
pages = {127--132},
publisher = {IEEE},
abstract = {Recent advances in wearable augmented reality devices foster the vision of ubiquitous interaction in an immersive, digitally augmented, physical world. Assuming that such devices could one day replace smartphones for accessing information, creating interfaces safeguarding face-to-face communication is challenging. This work presents the design of an interface that adapts the information visualisation to the presence of a possible interlocutor while allowing a high level of user control. The aim was to define an interface for wearable devices that is adaptive to interactions coming from the surrounding environment and expressly designed for application domains in which it is necessary to continuously monitor information, such as patient data in medical applications or the progress of a production process in an industrial environment. We focused on human-to-human communication, minimising the use of mid-air interaction to hide the synthetic information that might interrupt the conversation flow. Two different visualisation modalities allowing the coexistence of real and virtual worlds are proposed and evaluated in a preliminary study with six participants, who showed a generalised appreciation for the solution that maximises the display of information while requiring less user intervention.},
keywords = {Adaptive interface, Augmented Reality, Data visualization, Mobile computing, Patient monitoring, Ubiquitous computing, User interface},
pubstate = {published},
tppubtype = {inproceedings}
}
2019
Caggianese, Giuseppe; Colonnese, Valerio; Gallo, Luigi
Situated Visualization in Augmented Reality: Exploring Information Seeking Strategies Proceedings Article
In: 2019 15th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS), pp. 390–395, 2019.
Abstract | Links | BibTeX | Tags: Augmented Reality, Human computer interaction, Task analysis, Visualization
@inproceedings{caggianeseSituatedVisualizationAugmented2019,
title = {Situated Visualization in Augmented Reality: Exploring Information Seeking Strategies},
author = { Giuseppe Caggianese and Valerio Colonnese and Luigi Gallo},
doi = {10.1109/SITIS.2019.00069},
year = {2019},
date = {2019-11-01},
booktitle = {2019 15th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)},
pages = {390--395},
abstract = {In recent years, augmented reality applications have increasingly demonstrated the need for interaction with information related to, and directly shown in, the surrounding environment. Situated information is visualized in its semantic and spatial context, building up an environment enhanced by an information level that dynamically adapts to the production of the information and to the actions of the user. The exploration and manipulation of this type of data through see-through augmented reality devices still represents a challenging task. The development of specific interaction strategies capable of mitigating the current limitations of augmented reality devices is essential. In this context, our contribution has been to design possible solutions to address some of these challenges, allowing a dynamic interaction with situated information. Following the visual "information-seeking mantra" proposed by Shneiderman and introducing some "superpowers" for the users, in this work we present different strategies aimed at obtaining an overview, filtering, and acquiring details of a collection of situated data.},
keywords = {Augmented Reality, Human computer interaction, Task analysis, Visualization},
pubstate = {published},
tppubtype = {inproceedings}
}
2017
Brancati, Nadia; Caggianese, Giuseppe; Frucci, Maria; Gallo, Luigi; Neroni, Pietro
Experiencing Touchless Interaction with Augmented Content on Wearable Head-Mounted Displays in Cultural Heritage Applications Journal Article
In: Personal and Ubiquitous Computing, vol. 21, no. 2, pp. 203–217, 2017, ISSN: 1617-4909, 1617-4917.
Abstract | Links | BibTeX | Tags: Augmented Reality, Point-and-click interface, RGB-D, Touchless interaction, User study
@article{brancatiExperiencingTouchlessInteraction2017,
title = {Experiencing Touchless Interaction with Augmented Content on Wearable Head-Mounted Displays in Cultural Heritage Applications},
author = { Nadia Brancati and Giuseppe Caggianese and Maria Frucci and Luigi Gallo and Pietro Neroni},
doi = {10.1007/s00779-016-0987-8},
issn = {1617-4909, 1617-4917},
year = {2017},
date = {2017-01-01},
urldate = {2016-12-06},
journal = {Personal and Ubiquitous Computing},
volume = {21},
number = {2},
pages = {203--217},
abstract = {Cultural heritage could benefit significantly from the integration of wearable augmented reality (AR). This technology has the potential to guide the user and provide her with both in-depth information, without distracting her from the context, and a natural interaction, which can further allow her to explore and navigate her way through a huge amount of cultural information. The integration of touchless interaction and augmented reality is particularly challenging. On the technical side, the human-machine interface has to be reliable so as to guide users across the real world, which is composed of cluttered backgrounds and severe changes in illumination conditions. On the user experience side, the interface has to provide precise interaction tools while minimizing the perceived task difficulty. In this study, an interactive wearable AR system to augment the environment with cultural information is described. To confer robustness to the interface, a strategy that takes advantage of both depth and color data to find the most reliable information on each single frame is introduced. Moreover, the results of an ISO 9241-9 user study performed in both indoor and outdoor conditions are presented and discussed. The experimental results show that, by using both depth and color data, the interface can behave consistently in different indoor and outdoor scenarios. Furthermore, the results show that the presence of a virtual pointer in the augmented visualization significantly reduces the users' error rate in selection tasks.},
keywords = {Augmented Reality, Point-and-click interface, RGB-D, Touchless interaction, User study},
pubstate = {published},
tppubtype = {article}
}
2016
Caggianese, Giuseppe; Gallo, Luigi; Neroni, Pietro
Touchless Disambiguation Techniques for Wearable Augmented Reality Systems Proceedings Article
In: De Pietro, Giuseppe; Gallo, Luigi; Howlett, Robert J.; Jain, Lakhmi C. (Ed.): Intelligent Interactive Multimedia Systems and Services 2016, pp. 547–556, Springer International Publishing Switzerland, Puerto de la Cruz, Tenerife, Spain, 2016, ISBN: 978-3-319-39344-5, 978-3-319-39345-2.
Abstract | Links | BibTeX | Tags: Augmented Reality, Depth ray, SQUAD, Touchless interaction
@inproceedings{caggianeseTouchlessDisambiguationTechniques2016,
title = {Touchless Disambiguation Techniques for Wearable Augmented Reality Systems},
author = { Giuseppe Caggianese and Luigi Gallo and Pietro Neroni},
editor = { Giuseppe De Pietro and Luigi Gallo and Robert J. Howlett and Lakhmi C. Jain},
doi = {10.1007/978-3-319-39345-2_48},
isbn = {978-3-319-39344-5, 978-3-319-39345-2},
year = {2016},
date = {2016-06-01},
urldate = {2016-12-06},
booktitle = {Intelligent Interactive Multimedia Systems and Services 2016},
volume = {55},
pages = {547--556},
publisher = {Springer International Publishing Switzerland},
address = {Puerto de la Cruz, Tenerife, Spain},
series = {Smart Innovation, Systems and Technologies},
abstract = {The paper concerns target disambiguation techniques in egocentric vision for wearable augmented reality systems. In particular, the paper focuses on two of the most commonly used selection techniques in immersive environments: Depth Ray and SQUAD. The design and implementation of such techniques in a touchless augmented reality interface, together with the results of a preliminary usability evaluation carried out with inexpert users, are discussed. The user study provides insights on users' preferences when dealing with the precision-velocity trade-off in selection tasks, carried out in an augmented reality scenario.},
keywords = {Augmented Reality, Depth ray, SQUAD, Touchless interaction},
pubstate = {published},
tppubtype = {inproceedings}
}
2015
Brancati, Nadia; Caggianese, Giuseppe; De Pietro, Giuseppe; Frucci, Maria; Gallo, Luigi; Neroni, Pietro
Usability Evaluation of a Wearable Augmented Reality System for the Enjoyment of the Cultural Heritage Proceedings Article
In: 2015 The 11th International Conference on Signal-Image Technology and Internet-Based Systems (SITIS), pp. 768–774, IEEE, Bangkok, Thailand, 2015, ISBN: 978-1-4673-9721-6.
Abstract | Links | BibTeX | Tags: Augmented Reality, Cultural heritage, Touchless interaction, User study
@inproceedings{brancatiUsabilityEvaluationWearable2015,
title = {Usability Evaluation of a Wearable Augmented Reality System for the Enjoyment of the Cultural Heritage},
author = { Nadia Brancati and Giuseppe Caggianese and Giuseppe De Pietro and Maria Frucci and Luigi Gallo and Pietro Neroni},
doi = {10.1109/SITIS.2015.98},
isbn = {978-1-4673-9721-6},
year = {2015},
date = {2015-11-01},
urldate = {2016-12-06},
booktitle = {2015 The 11th International Conference on Signal-Image Technology and Internet-Based Systems (SITIS)},
pages = {768--774},
publisher = {IEEE},
address = {Bangkok, Thailand},
abstract = {The recent availability of low-cost wearable augmented reality (WAR) technologies is leveraging the design of applications in the cultural heritage domain in order to support users in their emotional journey among the cultural artefacts and monuments of a city. In this paper, we describe a user study evaluating the usability of a wearable augmented reality touchless interface for the enjoyment of the cultural heritage in outdoor environments. The usability evaluation has been carried out in out-of-lab settings with inexperienced users, during a three-day exhibition in the city of Naples. The presented results are related to the ease of use and of learning of the system, and to the user's satisfaction in the enjoyment of the system.},
keywords = {Augmented Reality, Cultural heritage, Touchless interaction, User study},
pubstate = {published},
tppubtype = {inproceedings}
}
Caggianese, Giuseppe; Gallo, Luigi; Neroni, Pietro
User-Driven View Management for Wearable Augmented Reality Systems in the Cultural Heritage Domain Proceedings Article
In: 2015 10th International Conference on P2P, Parallel, Grid, Cloud and Internet Computing (3PGCIC), pp. 545–550, 2015.
Abstract | Links | BibTeX | Tags: Augmented Reality, Context awareness, Cultural heritage, Ego-Vision, Visualization
@inproceedings{caggianeseUserDrivenViewManagement2015,
title = {User-Driven View Management for Wearable Augmented Reality Systems in the Cultural Heritage Domain},
author = { Giuseppe Caggianese and Luigi Gallo and Pietro Neroni},
doi = {10.1109/3PGCIC.2015.90},
year = {2015},
date = {2015-11-01},
booktitle = {2015 10th International Conference on P2P, Parallel, Grid, Cloud and Internet Computing (3PGCIC)},
pages = {545--550},
abstract = {The recent availability of low-cost wearable augmented reality (WAR) technologies is leveraging the design of applications in many different domains in order to support users in their daily activities. For most of these domains, the large amount of information displayable on top of the reality, directly in the user's field of view, represents an important challenge for designers. In this paper, we present a view management technique for placing touristic/cultural information, in the form of points of interest (POIs), in an AR system that works in the absence of a priori knowledge of the real environment. The user-driven view management technique, designed as a remote service, improves the representation and placement of the digital information each time the user manifests an interest in a particular area of the real space. The proposed approach includes a layout algorithm, which exploits the user's local position and her/his point of view direction, to correctly set the POI height in the user's view, avoiding overlapping and cluttering, together with an adaptive rendering method, using information about the brightness of the area, that computes the visual appearance parameters of each virtual POI in order to improve its readability over the background.},
keywords = {Augmented Reality, Context awareness, Cultural heritage, Ego-Vision, Visualization},
pubstate = {published},
tppubtype = {inproceedings}
}
Brancati, Nadia; Caggianese, Giuseppe; Frucci, Maria; Gallo, Luigi; Neroni, Pietro
Touchless Target Selection Techniques for Wearable Augmented Reality Systems Proceedings Article
In: Damiani, Ernesto; Howlett, Robert J.; Jain, Lakhmi C.; Gallo, Luigi; De Pietro, Giuseppe (Ed.): Intelligent Interactive Multimedia Systems and Services, pp. 1–9, Springer International Publishing Switzerland, Sorrento, Italy, 2015, ISBN: 978-3-319-19829-3, 978-3-319-19830-9.
Abstract | Links | BibTeX | Tags: Air tap, Augmented Reality, Thumb trigger, Touchless interaction, Wait to click
@inproceedings{brancatiTouchlessTargetSelection2015,
title = {Touchless Target Selection Techniques for Wearable Augmented Reality Systems},
author = { Nadia Brancati and Giuseppe Caggianese and Maria Frucci and Luigi Gallo and Pietro Neroni},
editor = { Ernesto Damiani and Robert J. Howlett and Lakhmi C. Jain and Luigi Gallo and Giuseppe De Pietro},
doi = {10.1007/978-3-319-19830-9_1},
isbn = {978-3-319-19829-3, 978-3-319-19830-9},
year = {2015},
date = {2015-06-01},
urldate = {2016-12-06},
booktitle = {Intelligent Interactive Multimedia Systems and Services},
volume = {40},
pages = {1--9},
publisher = {Springer International Publishing Switzerland},
address = {Sorrento, Italy},
series = {Smart Innovation, Systems and Technologies},
abstract = {The paper deals with target selection techniques for wearable augmented reality systems. In particular, we focus on the three techniques most commonly used in distant freehand pointing and clicking on large displays: wait to click, air tap and thumb trigger. The paper details the design of the techniques for a touchless augmented reality interface and provides the results of a preliminary usability evaluation carried out in out-of-lab settings.},
keywords = {Air tap, Augmented Reality, Thumb trigger, Touchless interaction, Wait to click},
pubstate = {published},
tppubtype = {inproceedings}
}
Brancati, Nadia; Caggianese, Giuseppe; De Pietro, Giuseppe; Frucci, Maria; Gallo, Luigi; Neroni, Pietro
Tecnologie Indossabili di Realtà Virtuale e Aumentata per la Fruizione Interattiva del Patrimonio Culturale Proceedings Article
In: Chianese, Angelo; Bifulco, Francesco (Ed.): Proceedings del Workshop LOSAI Laboratori Open su Scienza Arte e Innovazione, pp. 49–60, Napoli, Italy, 2015, ISBN: 978-88-99130-20-6.
Abstract | BibTeX | Tags: Augmented Reality, Ego-Vision, Touchless interaction, Virtual Reality, Wearable augmented reality
@inproceedings{brancatiTecnologieIndossabiliDi2015,
title = {Tecnologie Indossabili di Realtà Virtuale e Aumentata per la Fruizione Interattiva del Patrimonio Culturale},
author = { Nadia Brancati and Giuseppe Caggianese and Giuseppe De Pietro and Maria Frucci and Luigi Gallo and Pietro Neroni},
editor = { Angelo Chianese and Francesco Bifulco},
isbn = {978-88-99130-20-6},
year = {2015},
date = {2015-05-01},
booktitle = {Proceedings del Workshop LOSAI Laboratori Open su Scienza Arte e Innovazione},
pages = {49--60},
address = {Napoli, Italy},
abstract = {La fruizione del patrimonio culturale, tangibile e intangibile, è oggi in forte evoluzione. Il visitatore può non solo guardare le opere, ma interagire, richiedere informazioni aggiuntive su ciò che vede, inquadrare l'opera nel contesto socio-culturale. Le tecnologie di realtà aumentata e virtuale stanno diventando sempre più un valido strumento per rispondere a queste esigenze. Tuttavia, affinché tali tecnologie divengano un elemento efficace per la fruizione del patrimonio culturale, devono poter essere semplici da usare, non ingombranti e poter supportare il visitatore, fornendogli informazioni in qualsiasi luogo, outdoor (e.g., siti turistici, piazze), o indoor (e.g., musei, chiese). A valle di una panoramica delle nuove tecnologie e delle loro potenzialità, in questo articolo viene presentato un sistema prototipale che, tramite un dispositivo indossabile di realtà aumentata adatto all'utilizzo sia indoor che outdoor, permette di interagire mediante comandi gestuali con le informazioni proiettate nel campo visivo dell'utente.},
keywords = {Augmented Reality, Ego-Vision, Touchless interaction, Virtual Reality, Wearable augmented reality},
pubstate = {published},
tppubtype = {inproceedings}
}
2014
Caggianese, Giuseppe; Neroni, Pietro; Gallo, Luigi
Natural Interaction and Wearable Augmented Reality for the Enjoyment of the Cultural Heritage in Outdoor Conditions Proceedings Article
In: International Conference on Augmented and Virtual Reality (AVR 2014), pp. 267–282, Springer International Publishing Switzerland, Lecce, Italy, 2014.
Abstract | Links | BibTeX | Tags: Augmented Reality, Cultural heritage, Natural User Interfaces
@inproceedings{caggianeseNaturalInteractionWearable2014,
title = {Natural Interaction and Wearable Augmented Reality for the Enjoyment of the Cultural Heritage in Outdoor Conditions},
author = { Giuseppe Caggianese and Pietro Neroni and Luigi Gallo},
doi = {10.1007/978-3-319-13969-2_20},
year = {2014},
date = {2014-09-01},
urldate = {2016-12-06},
booktitle = {International Conference on Augmented and Virtual Reality (AVR 2014)},
volume = {8853},
pages = {267--282},
publisher = {Springer International Publishing Switzerland},
address = {Lecce, Italy},
series = {Lecture Notes in Computer Science (LNCS)},
abstract = {In this paper, a first prototype of a wearable, interactive augmented reality (AR) system for the enjoyment of the cultural heritage in outdoor environments is presented. By using a binocular see-through display and a time-of-flight (ToF) depth sensor, the system provides the users with a visual augmentation of their surroundings and with touchless interaction techniques to interact with synthetic elements overlapping with the real world. The paper describes the hardware and software system components, and details the interface specifically designed for a socially acceptable cultural heritage exploration. Furthermore, the paper discusses the lessons learned from the first public presentation of the prototype, which we carried out in Naples, Italy.},
keywords = {Augmented Reality, Cultural heritage, Natural User Interfaces},
pubstate = {published},
tppubtype = {inproceedings}
}
2011
Placitelli, Alessio Pierluigi; Gallo, Luigi
Low-Cost Augmented Reality Systems via 3D Point Cloud Sensors Proceedings Article
In: SITIS '11: Proceedings of the 7th International Conference on Signal Image Technology & Internet Based Systems, pp. 188–192, IEEE Computer Society, Dijon, France, 2011, ISBN: 978-0-7695-4635-3.
Abstract | Links | BibTeX | Tags: Augmented Reality, Point cloud
@inproceedings{placitelliLowCostAugmentedReality2011,
title = {Low-Cost Augmented Reality Systems via 3D Point Cloud Sensors},
author = { Alessio Pierluigi Placitelli and Luigi Gallo},
doi = {10.1109/SITIS.2011.43},
isbn = {978-0-7695-4635-3},
year = {2011},
date = {2011-12-01},
booktitle = {SITIS '11: Proceedings of the 7th International Conference on Signal Image Technology & Internet Based Systems},
pages = {188--192},
publisher = {IEEE Computer Society},
address = {Dijon, France},
abstract = {In this paper, we explore the use of widely available and low-priced 3D point cloud sensors, such as the Microsoft Xbox Kinect™ and Asus Xtion PRO LIVE™, for the application of computer-generated imagery in live-video streams in Augmented Reality (AR) systems. Specifically, we examine the typical pipeline of AR applications and explore the potential simplifications derived from the use of such devices during the calibration and registration steps, which are the most computationally expensive and time-consuming. Moreover, we describe how to approach the problem of face alignment, that is, the aligning of a previously captured model of a face to newly captured data, by using 3D point cloud data and open-source libraries.},
keywords = {Augmented Reality, Point cloud},
pubstate = {published},
tppubtype = {inproceedings}
}
Placitelli, Alessio Pierluigi; Gallo, Luigi
3D Point Cloud Sensors for Low-cost Medical In-situ Visualization Proceedings Article
In: 2011 IEEE International Conference on Bioinformatics and Biomedicine Workshops (BIBMW), pp. 596–597, IEEE, Atlanta, GA, USA, 2011, ISBN: 978-1-4577-1613-3.
Abstract | Links | BibTeX | Tags: 3D registration, Augmented Reality, Healthcare, Kinect
@inproceedings{placitelli3DPointCloud2011,
title = {3D Point Cloud Sensors for Low-cost Medical In-situ Visualization},
author = { Alessio Pierluigi Placitelli and Luigi Gallo},
doi = {10.1109/BIBMW.2011.6112435},
isbn = {978-1-4577-1613-3},
year = {2011},
date = {2011-11-01},
booktitle = {2011 IEEE International Conference on Bioinformatics and Biomedicine Workshops (BIBMW)},
pages = {596--597},
publisher = {IEEE},
address = {Atlanta, GA, USA},
abstract = {Medical in-situ visualization deals with the display of the patient's specific imaging data at the location where they actually are. To be effective, it requires high end I/O devices, and computationally expensive and time-consuming calibration and registration steps. In this paper, we explore the use of widely available and low-priced 3D point cloud sensors in medical augmented reality (AR) applications. Specifically, we examine the typical pipeline of AR applications and explore the potential simplifications derived from the use of RGB-D cameras during the calibration and registration steps. Moreover, we describe a low-cost system built from open-source components that takes advantage of 3D point cloud data to apply medical imagery to live-video streams of patients.},
keywords = {3D registration, Augmented Reality, Healthcare, Kinect},
pubstate = {published},
tppubtype = {inproceedings}
}
Other
Talks
Keynote Talks
- 2023, October 10 – Artificial Intelligence and Virtual Reality Advances and Applications in Research Oncology and Clinical Oncology, Next Oncology 2023, Next Oncology – Supporting Oncology through innovation, Milano, Italy. Website
- 2022, March 1 – Touchless interactions in Surgery, ICCI 2022, International Conference on Cybernetics and Innovations, Ratchaburi, Thailand. Website
- 2019, June 25 – Interactive Virtual Environments: From the Laboratory to the Field, SalentoAVR 2019, 6th International Conference on Augmented and Virtual Reality, Lecce, Italy. Website
- 2016, October 2 – Vision-based human-computer interaction in the operating theatre, PRIP 2016, 13th International Conference on Pattern Recognition and Information Processing, Minsk, Republic of Belarus. Website
- 2015, September 1 – Touchless Interaction in Surgery: the Medical Imaging Toolkit experience, SalentoAVR 2015, 2nd International Conference on Augmented and Virtual Reality, Lecce, Italy. Website
Invited Talks
- 2023, November 14 – Realtà Aumentata e Virtuale, La Scienza che non c’era: L’informatica e i prossimi 100 anni del CNR, Area di Ricerca CNR Pisa, Pisa, Italy. Website
- 2023, September 30 – Lessons Learnt from the SMART BEAR Project, STRESS Congress, Palatul Parlamentului, Bucharest, Romania. Host: Prof. Luiza Spiru.
- 2023, May 2 – Enhancing and promoting tangible and intangible cultural heritage: an HCI perspective, PhD Course in “Umanesimo e Tecnologie”, Università di Macerata, Italy. Host: Prof. Roberto Lambertini.
- 2022, December 19 – Sperimentare la realtà estesa e l’interazione uomo-macchina nella riabilitazione personalizzata, Personalized Rehabilitation: Combining Mind, Body and Genetics 2022, Università degli Studi della Campania Luigi Vanvitelli, Napoli, Italy.
- 2022, October 25 – Home Care and Assisted Living for the Elderly: The SMART BEAR Approach, Digital Transformation Summit 2022, Funchal, Madeira, Portugal. Website
- 2022, September 9 – Enhancing and promoting tangible and intangible cultural heritage: an HCI perspective, eXtended Reality and Artificial Intelligence, International Summer School 2022 on “XR and AI for enhancing cultural and territorial heritage”, Matera, Italy. Host: Prof. Ugo Erra. Website
- 2018, December 14 – The potential of virtual reality in various health care settings: promises and challenges, Artificial Intelligence and Health, Rome, Italy. Host: Prof. Clara Balsano. Website
- 2018, November 15 – Piattaforma DatabencArt – progetto per le scuole della Campania, XXII Edizione della Borsa Mediterranea del Turismo Archeologico, Paestum (SA), Italy. Host: Luisa Franzese. Website
- 2018, September 29 – La realtà virtuale per il training cognitivo, Convention Realtà virtuale in soggetti con impairment cognitivo, Gallarate (VA), Italy. Host: Dr. Marco Predazzi. Website
- 2017, February 15 – Interactive ICT technologies for Cultural Heritage, Museo Archeologico Nazionale di Napoli (MANN), Napoli, Italy. Host: Dr. Paolo Giulierini. Website
- 2014, June 6 – Interfacce gestuali touchless per la visualizzazione di immagini mediche, Chirurgie 2014 – Simulazione nella Formazione, Programmazione e Ricerca in Chirurgia, Napoli, Italy. Host: Prof. Marco De Fazio. Website
- 2013, November 11 – Multi-DOF touchless interaction with 3D medical data, Body Tracking in Healthcare 2013, Microsoft Research, Cambridge, UK. Organisers: Abigail Sellen, Kenton O’Hara, Scarlet Schwiderski-Grosche. Website
- 2008, March – Realistic vs. magic interaction metaphors in virtual environments, Multimedia Techniques for Device and Ambient Intelligence – MTDAI 2008, Mogliano Veneto, Italy. Host: Prof. E. Damiani.
Awards
Best Research Paper Award @ AIxPAC
Best Research Paper Award @ ICT4AWE
Second Place Award - Gesture Demonstration Competition @ ICPR
L. Gallo, A.P. Placitelli, G. De Pietro, “A Kinect NUI for 3D Medical Visualization”, demonstrated at the CHALEARN Gesture Demonstration Competition, in conjunction with the 21st International Conference on Pattern Recognition (ICPR 2012), Tsukuba International Congress Center, Tsukuba Science City, Japan, November 10-11, 2012. Organizers: Isabelle Guyon, Vassilis Athitsos. Judges: Alex Balan, Hugo Jair Escalante, Paul Doliotis, Jeffrey Margolis. ChaLearn Gesture Demonstration Competition website
Best Research Paper Award @ IIMSS
L. Gallo, “A Glove-Based Interface for 3D Medical Image Visualization”, presented at the 3rd International Symposium on Intelligent and Interactive Multimedia: Systems and Services, Baltimore, USA, 28-30 July 2010.
Master Degree & PhD Theses
L. Gallo, “Semi-immersive interactive virtual environments for 3D medical imaging,” University of Naples “Parthenope”, Philosophiæ Doctor degree in Information Engineering, 2010. Tutor: Prof. Luigi Romano. Ph.D. Thesis
L. Gallo, “Distribuzione ed esecuzione automatica di task in griglie pervasive,” University of Naples “Federico II”, Master of Science in Computer Engineering, 2006. Supervisors: Antonio Coronato, Giuseppe De Pietro. M.Sc. Thesis