Abstract of the talk: The increasingly fast pace of city life leaves people exhausted, and population aging reminds society of the need to make household appliances easier to use. Smart homes address both problems and can also greatly reduce the energy users expend at home. In this work, based on a dataset of sensor readings recorded by a waist-worn mobile phone during different activities, an activity recognition tool has been developed to provide a new control strategy for smart homes. The tool classifies user activities, including standing, sitting, lying, walking, and walking upstairs and downstairs, by analyzing the sensor data. A Micro Controller Unit (MCU) transmits and updates human movement data in real time. A machine learning algorithm customized to the experimental data, combined with a method based solely on human movement data, then analyzes and identifies user activity. The main advantage of this system is that its hardware is simple, efficient, and cost-effective. In our evaluation, the proposed system recognized human activities with more than 85% accuracy. Unlike current mainstream approaches based solely on machine learning, we also incorporate human kinematics data to better fit the training model, and different parameters can be configured for users with different physical conditions for better versatility.
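The pipeline the abstract describes (sensor windows in, activity label out) can be illustrated with a minimal sketch. Everything here is an assumption for illustration only: the per-activity accelerometer-magnitude profiles, the (mean, standard deviation) features, and the nearest-centroid classifier are hypothetical stand-ins, not the customized algorithm or kinematics-based method used in the talk.

```python
import math
import random

random.seed(0)  # deterministic synthetic data for the demo

# Hypothetical (mean, std) of acceleration magnitude in g per activity;
# the real dataset and feature set in the talk may differ.
PROFILES = {
    "standing":   (1.00, 0.02),
    "sitting":    (0.99, 0.01),
    "lying":      (1.00, 0.005),
    "walking":    (1.05, 0.25),
    "upstairs":   (1.10, 0.35),
    "downstairs": (1.08, 0.40),
}

def extract_features(window):
    """Mean and standard deviation of an acceleration-magnitude window."""
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    return mean, math.sqrt(var)

def classify(window, centroids):
    """Nearest-centroid classification in (mean, std) feature space."""
    m, s = extract_features(window)
    return min(
        centroids,
        key=lambda a: (m - centroids[a][0]) ** 2 + (s - centroids[a][1]) ** 2,
    )

# Synthetic "walking" window, as a phone's accelerometer might produce it.
window = [random.gauss(1.05, 0.25) for _ in range(200)]
print(classify(window, PROFILES))
```

In a deployed system, the MCU would stream real accelerometer windows in place of the synthetic data, and the centroids (or a trained model) would be fitted per user, which is one way the per-user parameter configuration mentioned above could be realized.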
Bio of the presenter: Dr. Pushpendu Kar is an Assistant Professor in the School of Computer Science at the University of Nottingham Ningbo China (the China campus of the University of Nottingham, UK). Before this, he was a Research Fellow in the Department of ICT and Natural Sciences at the Norwegian University of Science and Technology (NTNU), Norway; the Department of Electrical & Computer Engineering at the National University of Singapore (NUS); and the Energy Research Institute at Nanyang Technological University (NTU), Singapore. He completed his PhD, Master of Engineering, and Bachelor of Technology, all in Computer Science and Engineering. He also completed the Sun Certified Java Programmer (SCJP) 5.0 certification, one professional course on Hardware & Networking, two professional courses on JAVA-J2EE, a Finishing School Program at the National Institute of Technology Durgapur, India, and a UGC-sponsored refresher course at Jadavpur University, India. Dr. Kar made a research visit to Inria Paris, France. He was awarded the prestigious Erasmus Mundus Postdoctoral Fellowship of the European Commission, the ERCIM Alain Bensoussan Fellowship of the European Union, and the SERB OPD Fellowship of the Department of Science and Technology, Government of India. He received the 2020 IEEE Systems Journal (2020 I.F.: 4.463) Best Paper Award, given to seven of 793 papers [top 1%]. He has also received travel grants to attend conferences and doctoral colloquiums. Dr. Kar has more than 10 years of teaching and research experience, including at several highly reputed organizations around the world, and worked as a software professional at IBM for one and a half years. He is the author of more than 30 scholarly research papers, published in reputed journals including ACM TAAS, IEEE TNSM, IEEE Systems Journal, IEEE Sensors Journal, and the Journal of Building and Environment; in conferences including ICC, TENCON, and IECON; and in IT magazines.
He is also an inventor of four patents. He has served on the program committees of several conferences, helped organize short-term courses, and delivered a few invited talks. He is a regular reviewer for IEEE, Elsevier, Wiley, and Springer journals and conferences.
Abstract of the talk: In recent years, technological developments have made it possible to create immersive experiences that cross the uncanny valley (volumetric capture), engage all senses (haptic, olfactory, and gustatory stimuli), give the viewer the agency of a "film director", and allow interaction with the story elements themselves. I call these technologies an "Artificial Reality Continuum". In my talk, I will provide an overview of the technologies involved and discuss examples of my work in this new medium, as well as early attempts to understand the differences between traditional storytelling and immersive storytelling.
Bio of the presenter: Dr. Pietroszek is a tenure-track professor in the Film and Media Arts Division of the School of Communication. He is the Founding Director of the Institute for Immersive Designs, Experiences, Applications, and Stories (Institute for IDEAS), an Associate Director of the Center for Environmental Filmmaking, and affiliated faculty at the AU Game Center and the Department of Computer Science. Krzysztof teaches immersive filmmaking courses in the undergraduate and graduate film programs and game development courses in graduate game programs.
Research and Contribution: Dr. Pietroszek's research interests include the application of machine learning to media, the design of volumetric cameras and telemedicine equipment, and the development of technologies for creating films and interacting in virtual and augmented reality.
He is also a filmmaker, who produced an award-winning feature film ("Waiting for Summer"), wrote and directed six short fiction and documentary films ("Private Apocalypse of Tim", "Agape", "Daniel", "Eve", "Greenscreen", and "Vera"), and created a transmedia mixed-reality experience ("Vera").
He developed several VR video games ("Cube VR", "Chessnaught VR") and has published over 40 peer-reviewed research papers. Dr. Pietroszek's research is supported by the National Science Foundation, the Canada Media Fund, private industry partners, and other federal funding agencies. To date, Krzysztof has raised nearly 3,000,000 USD in funding.