First discussions about the project. We exchange resources and ideas about its overall goal.
(added on )For the midterm deliverables, the architecture of the application was established and presented in the following technical diagrams: https://github.com/FiiGezr/architecture . We also created a report presenting the preliminary considerations about the internal data structures/models to be used and the external APIs managed by the application.
(added on )We are required to develop a conceptual model that helps Santa Claus choose the right gifts for every child. We need to take into account hobbies and interests, but also weather and road conditions, and even pandemic-related regulations.
(added on )After a period of research on the topic, we meet and discuss the project further. We set our eyes on Google's MediaPipe project as a means of detecting hand gestures. We delegate tasks for research and/or development.
(added on )We have a working module for hand gesture detection using Google's MediaPipe. Work has also started on the server and the conceptual model.
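To give an idea of what this module involves, here is a minimal sketch of a MediaPipe Hands detection loop in Python, using OpenCV for webcam capture. The function name and parameter values are illustrative, and the gesture classification step is only hinted at in a comment; this is not the project's actual code.

```python
# Minimal sketch: hand landmark detection with MediaPipe's "solutions" API.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def detect_hand_landmarks():
    cap = cv2.VideoCapture(0)  # default webcam
    with mp_hands.Hands(max_num_hands=1,
                        min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand in results.multi_hand_landmarks:
                    # 21 landmarks per hand; a gesture classifier would
                    # inspect their relative positions here.
                    print(hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP])
    cap.release()

if __name__ == "__main__":
    detect_hand_landmarks()
```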
(added on )Support for more gestures has been added. The first version of the conceptual model has been created with the Owlready2 Python module. An integrating server prototype is almost ready.
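As a rough illustration of how such a model can be expressed with Owlready2, the sketch below defines a tiny ontology with hypothetical Child, Gift and Hobby classes and two object properties. The class and property names, and the ontology IRI, are assumptions for illustration, not the project's real schema.

```python
# Minimal sketch of a gift-recommendation ontology with Owlready2.
from owlready2 import get_ontology, Thing, ObjectProperty

onto = get_ontology("http://example.org/santa.owl")  # hypothetical IRI

with onto:
    class Child(Thing): pass
    class Gift(Thing): pass
    class Hobby(Thing): pass

    class has_hobby(ObjectProperty):
        domain = [Child]
        range = [Hobby]

    class suitable_for(ObjectProperty):
        domain = [Gift]
        range = [Hobby]

    # Example individuals: a child who likes drawing could be matched
    # against gifts declared suitable for that hobby.
    drawing = Hobby("drawing")
    alice = Child("alice", has_hobby=[drawing])
    crayons = Gift("crayons", suitable_for=[drawing])

onto.save(file="santa.owl")
```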
(added on )The first server version integrating all components is ready. Work starts on interactive components (quiz game).
(added on )The quiz game component has been brought to a working state. Work starts on a stats feature. A new branch of the server component has been created that will use socket streaming from the browser.
(added on )The interactive components are further fleshed out. Gesture detection has been integrated into the socket streaming server.
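A minimal sketch of how a socket-streaming server with integrated gesture detection could look, assuming a Flask-SocketIO stack in which the browser emits base64-encoded JPEG frames on a "frame" event. The framework choice, event names and payload format are assumptions, not necessarily what the project implements.

```python
# Minimal sketch: receive browser frames over Socket.IO and run MediaPipe on them.
import base64

import cv2
import mediapipe as mp
import numpy as np
from flask import Flask
from flask_socketio import SocketIO, emit

app = Flask(__name__)
socketio = SocketIO(app, cors_allowed_origins="*")
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

@socketio.on("frame")
def handle_frame(data):
    # data is assumed to be a raw base64-encoded JPEG captured in the browser.
    img_bytes = base64.b64decode(data)
    frame = cv2.imdecode(np.frombuffer(img_bytes, np.uint8), cv2.IMREAD_COLOR)
    if frame is None:
        return
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    # Reply with whether a hand was detected; a real handler would classify
    # the gesture from the 21 landmarks and emit its label instead.
    emit("gesture", {"hand_detected": results.multi_hand_landmarks is not None})

if __name__ == "__main__":
    socketio.run(app, host="0.0.0.0", port=5000)
```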
(added on )