Facilitate Our Personal and Community Connection

2023 Capstone Design & Innovation Day

Project AI-15: 2-D Vibration Detection of Guitar Strings Using Piezo Sensors

Project Client: Yamaha Guitar Group, Inc.

Project Description: Global guitar sales totaled $17.2 billion in 2020, representing a sizeable market in the modern music industry [1]. Electric guitars make up over 58% of these sales as attempts to modernize this classical art expression increase. With an expected growth of 17% through 2023, there is an expanding market for electric guitar companies to create novel sounds, much like those introduced by high-gain amps and pedals, which use digital signal processing to change the quality of the sound [2].

Guitars rely on the resonating vibrations of tensioned strings to produce audible output. These strings support two main modes of vibration across the three major axes. Transverse waves produce displacements along the two axes perpendicular to the string, while longitudinal waves produce pressure differences along the axis parallel to the string; both wave types travel in the direction parallel to the string. Understanding all modes of vibration and developing methods to measure them is crucial to capturing the complete sound profile generated by the string. Currently, our clients have piezoelectric guitar bridges that can sense vibration in only one dimension perpendicular to the string. Since strings vibrate in all three dimensions, this results in a loss of information and, consequently, an incomplete reproduction of the original sound profile.
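To make the value of a second transverse axis concrete, the short sketch below (written in Python with invented signal parameters, not the project's hardware or firmware) simulates a string vibrating in both transverse axes and contrasts what a one-axis pickup keeps with what a two-axis sensor would preserve:

  import numpy as np

  # Hypothetical illustration: capture 2D transverse string motion from two
  # orthogonal sensing channels. Sample rate, frequency, amplitudes, and phase
  # are placeholder values, not measurements from the actual system.
  fs = 48_000                      # sample rate in Hz (assumed)
  t = np.arange(0, 0.01, 1 / fs)   # 10 ms of samples

  f0 = 82.4                        # roughly an open low-E string
  x = 1.0 * np.sin(2 * np.pi * f0 * t)              # displacement along one transverse axis
  y = 0.6 * np.sin(2 * np.pi * f0 * t + np.pi / 3)  # displacement along the other transverse axis

  one_d_signal = x                          # what a 1D bridge sensor captures
  two_d_signal = np.stack([x, y], axis=1)   # what a 2D sensor pair would preserve

  print("1D samples:", one_d_signal.shape, "2D samples:", two_d_signal.shape)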

In addition to these technical limitations, 1D sensing also restricts the sound effects created by the different string-movement techniques that enhance the guitar-playing experience. Plucking (with a pick) and strumming (with fingers) inherently produce different sound waves in different dimensions, not all of which can be captured by sensing transverse vibration along a single axis. Slapping and popping, in which the player strikes the string against the instrument’s surface to generate a percussive sound, are further cases where capturing 1D motion is not enough to describe the string’s movement. Finally, our clients have existing guitars that alter the sound profile to mimic other instruments such as banjos or sitars, and these sound effects will only improve with the additional data provided by the second dimension.

Our project goal is to design a system that can capture the 2D motion perpendicular to the string, giving our client a complete picture of the transverse wave behaviour of a particular string. To do this, we will design a prototype sensor system (herein referred to as the “system”) and produce a brief outline of our research on various sensor technologies. Not only will our project provide a more accurate reproduction of the guitar’s sound profile, but it will also enhance the future sound effects Yamaha and Line 6 create by providing second-dimension data. Line 6 can then use our prototype and research as a jumping-off point for more in-depth research into a new type of product.


Project CG-42: 3D Mapping Smart Glasses for People with Vision Loss

Project Client: Seleste Innovations Inc.

Project Description: Seleste has created glasses that leverage machine learning and cameras to identify and describe objects, in line with their vision of allowing visually impaired users to regain their freedom and independence. Seleste’s vision extends beyond object identification: they aim to develop indoor navigation capabilities for their users by generating a 3D map of a space, localizing the user, and providing directional audio navigation.

To support Seleste’s vision, our capstone project focused on addressing the technical challenge of navigation by identifying the optimal methods for collecting 3D maps and evaluating the feasibility of different hardware and software technologies. Our investigation aimed to expand the functionality of Seleste’s glasses and empower visually impaired individuals with greater independence and freedom.

If you are interested in learning more about the project or contacting the Seleste team, please visit their website or reach out to us via the email addresses listed below:

Cody Li: codyli@student.ubc.ca ; Connor Bews: connbews@telus.net ; Bolong Tan: 282170074@qq.com ; Nick Hamilton: nick.hamilton@shaw.ca ; Jonathan Zhang: zhanghaosen328@gmail.com 


Project PB-30: Chat Bot & Recommendation Engine

Project Client: UBC Cloud Innovation Centre

Project Description: Our team worked with UBC’s Cloud Innovation Centre to create a proof of concept for a chatbot that facilitates student engagement with the university. The current academic advising system at UBC suffers from long wait times and limited advisor availability. Our capstone project aims to provide a simple and intuitive interface that students can use as an alternative resource when looking for answers to their course- or degree-related questions.

The UBC student assistance bot is built on AWS resources to provide students with a robust, maintainable, and scalable solution that is available at all hours of the day. Our team leveraged conversational AI and natural language processing tools to provide the chatbot’s core functionality. These tools allow our bot to conduct coherent conversations with the user and provide appropriate answers to a variety of questions. The underlying data the chatbot uses to answer student questions is aggregated from a number of UBC resources and is periodically updated to ensure correctness. While the main focus of the chatbot is answering student questions, our team also implemented the ability for the chatbot to receive feedback from users, which system administrators can review to make improvements where necessary. Additionally, the deployment of the chatbot system has been automated, which allows future developers to easily update the chatbot by widening the scope of its data or improving the AI interface.
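As an illustration of how such a bot might be queried programmatically, the sketch below assumes Amazon Lex V2 as the conversational AI service; the project description does not name the specific AWS services used, and the region, bot, alias, and session IDs shown are placeholders:

  import boto3

  # Hypothetical sketch: sending a student's question to an Amazon Lex V2 bot.
  # The region, bot ID, alias ID, and session ID are placeholder values.
  client = boto3.client("lexv2-runtime", region_name="us-west-2")

  response = client.recognize_text(
      botId="EXAMPLEBOTID",
      botAliasId="EXAMPLEALIAS",
      localeId="en_US",
      sessionId="student-session-001",
      text="What are the prerequisites for this course?",
  )

  # Print any reply messages returned by the bot.
  for message in response.get("messages", []):
      print(message.get("content"))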


Project PN-13: Gesture Controlled Drone Vlogging Assistant

Project Client: Huawei Technologies Canada Inc.

Project Description: Homemade vlogs have consistently been popular in the online space. However, being both the cameraperson and the subject can be challenging for an individual, as they would need to manage the camera while authentically experiencing the moment. The Obstacle Avoiding Drone Vlogging Assistant is a convenient tool for vloggers and professional athletes to create high-quality videos without the need for additional support. The drone is designed to autonomously track a person, avoid obstacles, and capture videos from various angles.

The major design contribution of this project is the integration of OpenPose, an offline machine learning model for human pose estimation, with a modified artificial potential field obstacle-avoidance algorithm on Huawei’s ATLAS 200DK board. The ATLAS 200DK is a high-performance AI application developer board built around the Ascend 310 AI processor. Another challenge for this project was working with a new, untested drone that required new network connections and new API commands.
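For readers unfamiliar with artificial potential fields, the sketch below shows the basic idea in a simplified 2D form; the gains, influence radius, and step size are illustrative assumptions and do not represent the team's modified algorithm or the code deployed on the ATLAS 200DK:

  import numpy as np

  # Simplified 2D artificial potential field: the drone is attracted to a goal
  # (e.g., the tracked person) and repelled by nearby obstacles.
  K_ATT = 1.0        # attractive gain (assumed)
  K_REP = 100.0      # repulsive gain (assumed)
  D_INFLUENCE = 2.0  # obstacles farther than this exert no force (assumed, metres)

  def potential_field_step(position, goal, obstacles):
      """Return a small motion step for the drone at `position`."""
      position, goal = np.asarray(position, float), np.asarray(goal, float)

      # Attractive force pulls the drone straight toward the goal.
      force = K_ATT * (goal - position)

      # Each nearby obstacle pushes the drone away, more strongly when closer.
      for obs in obstacles:
          diff = position - np.asarray(obs, float)
          dist = np.linalg.norm(diff)
          if 0 < dist < D_INFLUENCE:
              force += K_REP * (1.0 / dist - 1.0 / D_INFLUENCE) * diff / dist**3

      # Normalize to a bounded step so the drone moves smoothly.
      return 0.1 * force / max(np.linalg.norm(force), 1e-6)

  # Example: drone at the origin, person at (5, 0), one obstacle at (2, 0.5).
  print(potential_field_step([0, 0], [5, 0], [[2, 0.5]]))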

For more information about the Gesture Controlled Drone Vlogging Assistant project, or if you have any job opportunities related to AI and machine learning, please feel free to contact the team via our outreach representative at annatqwang@gmail.com


Project PN-32: Innovation Connections – Knowledge Graph

Project Client: UBC Cloud Innovation Centre

Project Description: Large universities such as the University of British Columbia (UBC) attract many researchers, and with those researchers comes a large amount of data that can be leveraged to connect researchers with each other and to guide how funding is allocated. However, data on researchers is spread over many sources, and drawing meaningful conclusions from it can be challenging. The UBC Cloud Innovation Centre approached us to develop a tool to visually represent and analyze the data so that universities can improve how they approach research. Our team developed a knowledge graph using React and Amazon Web Services (AWS), where nodes represent researchers and edges denote relationships between them. Key decision makers at universities can use our website to search for researchers, identify their networks, and find potential research opportunities. The entire solution is open source and deployable to the cloud using AWS, allowing universities beyond UBC to leverage this tool for their research endeavors.
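As a small illustration of the underlying data model only (the researcher names and relationship labels below are invented, and the project's actual AWS data store and React front end are not shown), a knowledge graph of researchers can be represented as nodes connected by labelled edges:

  import networkx as nx

  # Hypothetical sketch: nodes are researchers, edges describe how they relate.
  graph = nx.Graph()
  graph.add_node("Researcher A", department="Electrical and Computer Engineering")
  graph.add_node("Researcher B", department="Computer Science")
  graph.add_node("Researcher C", department="Medicine")

  graph.add_edge("Researcher A", "Researcher B", relationship="co-authored publications")
  graph.add_edge("Researcher B", "Researcher C", relationship="shared grant")

  # A decision maker could then ask: who is in Researcher A's extended network?
  print(sorted(nx.node_connected_component(graph, "Researcher A")))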


Project PN-81: Advanced Video Surveillance

This team will not have a booth at the Design & Innovation Day due to confidentiality agreements.

Project Client: TELUS Communications Inc.

Project Description: Have you ever been paranoid about strangers lurking around outside your house? Our machine learning model helps in keeping your home secure by detecting anomalies in your surveillance footage. You’ll be alerted to any suspicious activity in real-time so you can take action quickly. Our advanced algorithms analyze footage for unusual behavior, giving you peace of mind that your home is always being monitored. 

A major technical challenge we resolved was reducing false positives while making the algorithm more accurate at detecting anomalies.
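The sketch below is not the team's model; it only illustrates, with invented per-frame anomaly scores, how moving the decision threshold trades detections against false positives:

  import numpy as np

  # Hypothetical per-frame anomaly scores and ground-truth labels (1 = anomaly).
  scores = np.array([0.05, 0.12, 0.08, 0.91, 0.10, 0.55, 0.07, 0.88])
  labels = np.array([0,    0,    0,    1,    0,    0,    0,    1   ])

  for threshold in (0.5, 0.6, 0.9):
      flagged = scores >= threshold
      detected = int(np.sum(flagged & (labels == 1)))
      false_positives = int(np.sum(flagged & (labels == 0)))
      print(f"threshold={threshold}: {detected}/2 anomalies detected, "
            f"{false_positives} false positives")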

Contact Information: Andrew Shieh: andrewshieh7379@gmail.com ; Bowei Ren: caesarrnn@gmail.com ; Elbert Ng: elbertng25@gmail.com ; Sebastian Gonzalez: seb.gonzalezsg.1999@gmail.com ; Vincent Sastra: vincent.sastra@gmail.com


Project SF-60: Neesh – An LGBTQ2+ Community Mobile Application

Project Client: Qrated Studio Inc.

Project Description: The LGBTQ2+ population in North America is looking for ways to ease their anxiety about finding acceptance in themselves, in society, and from the people who want to support them.

Our client, Neesh, is a startup looking to help solve anxiety problems among the LGBTQ2+ community. Neesh’s solution to such problems is to provide an anonymous online forum that allows queer people to freely speak their minds and find similarities, solutions, and acceptance regarding their issues. Currently, there are few apps that are geared towards non-romantic and/or non-sexual queer audiences, but Neesh aims to fill that gap with their product.

Our team was given the Figma UI designs of the app and then designed, built, and tested the Neesh app from scratch, implementing its key features:

  • User Login / Sign-up with phone number
  • User Feed of list of ‘convos’
  • UI and Animations
  • ‘Convos’
    • Creating ‘convos’ (posts)
    • Associate topics with convos
    • Liking
    • Commenting
    • Replying to comments
  • Explore Page
    • Browse convo by topics
  • Notification
    • Push notification
    • List of all past notifications

Contact Information: Trevor Flanigan : trevorflanigan@alumni.ubc.ca : 604-318-9666 ; Yisheng Liu ; Amir Barkam : amirbarkam6@gmail.com : 778-323-1899 ; Arnold Ying : arnoldying825@gmail.com : 778-887-0856 ; Justin Hua : huajustinh@gmail.com : 778-840-7189


Project SF-64: Remote Mobility Monitoring System

Project Client: UBC Cloud Innovation Centre

Project Description: The purpose of the project is to create a technology solution that allows caregivers to monitor the mobility metrics of mobility-affected people, such as senior citizens, on a daily basis without the need for synchronous meetings with the patients. The goal is ultimately to help the University of British Columbia Cloud Innovation Centre (CIC) achieve its mission of helping the community through sustainable and repeatable technology solutions. The project outcome, Mobimon, is a complete functionality set that includes an iOS app to extract mobility data from patients, an AWS backend that handles all the processing, and a web app to manage and visualize the mobility data. The CIC can easily deploy the set for use among nursing homes, hospitals, and other relevant organizations within the community.