Today, the U.S. Patent and Trademark Office published Apple’s patent application covering improvements in locating a user within a home and identifying the applications that the user is likely to use in a particular room. For example, the system might surface an Apple TV app that turns on the TV when the user walks into the family room, or a garage door opener app when they walk into the garage.
Apple states in the patent background: As modern mobile devices become more integrated into everyday life, the number of applications stored on them keeps increasing; modern mobile phones often come with hundreds of applications, and that breadth of applications is a large part of what makes mobile devices useful. However, it can be difficult and time consuming for a user to find and run a desired application among all of the available ones.
Apple’s invention improves the determination of the user’s location in the home and identifies the user’s applications based on the determined home location.
Applications on mobile devices (e.g., home applications) can be used to control other devices, such as accessory devices throughout the home (e.g., kitchen appliances, lights, thermostats, smart door locks, blinds, etc.). The home application user may be in the same room as the controlled accessory device or in a different room. For example, a user may be in the kitchen when she closes the garage door using the Home application on her mobile device.
An “accessory device” may be a device located in or near a particular environment, region, or location, such as a home, apartment, or office. Accessory devices include garage doors, door locks, fans, lighting devices (such as lamps), thermostats, windows, blinds, kitchen appliances, and other devices configured to be controlled by a home application.
Accessory devices can be discovered by, or associated with, the Home application. They can be identified, for example, by the mobile device automatically scanning the environment for accessory devices. Alternatively, the user can manually enter accessory device information, for example through the Home application.
Users often use accessory devices to perform the same or repeated actions when they are in a particular location. For example, every time a user comes home from work, they might close the garage door while in the kitchen. Similarly, when it is dark outside, a user might turn on the living room lamps or adjust the thermostat without leaving the living room. Thus, certain actions involving devices in the home may be performed regularly and repeatedly (e.g., daily or several times a day) while the user is at a particular location, and performing them manually can be time consuming and tedious.
To provide an improved mobile device and method, embodiments use sensor measurements to recommend applications and/or accessory devices, or to automate application actions, based on the application’s usage history at identifiable locations (sometimes referred to as microlocations).
A sensor (e.g., an antenna and associated circuitry) on a mobile device can measure sensor values derived from one or more essentially stationary signal sources, such as wireless signals emitted by wireless routers or network-enabled appliances in the home. These sensor values are reproducible at the same physical location of the mobile device, so they can be used as a proxy for physical location. In this way, the sensor values can form a sensor location in sensor space rather than physical space.
A “sensor location” may be a multidimensional data point defined by a separate sensor value in each dimension. In various embodiments, a measured parameter of the wireless signal may be a signal characteristic (e.g., signal strength, or a time of flight such as round-trip time (RTT)) or another value measured by a sensor of the mobile device, for example from data transmitted in one or more radio signals.
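To make the idea concrete, here is a minimal sketch (not from the patent) of a sensor location as a multidimensional data point: one dimension per stationary signal source, with each dimension holding a measured value such as signal strength. The source names and RSSI values are invented for illustration.

```python
# Sketch: a "sensor location" as a multidimensional data point, one
# dimension per stationary signal source (e.g., RSSI in dBm per device).
from dataclasses import dataclass

# Hypothetical stationary signal sources in the home.
SOURCES = ["router_living_room", "smart_tv", "thermostat"]

@dataclass
class SensorLocation:
    values: dict  # source id -> measured RSSI (dBm)

    def as_vector(self) -> list:
        # Fixed source order, so repeated scans map to comparable dimensions.
        # A missing source gets a floor value standing in for "not heard".
        return [self.values.get(src, -100.0) for src in SOURCES]

scan = SensorLocation({"router_living_room": -42.0, "smart_tv": -67.5})
print(scan.as_vector())  # [-42.0, -67.5, -100.0]
```

Because the same physical spot yields similar vectors on repeated scans, these vectors can stand in for physical location without any floor plan.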
A “cluster” corresponds to a group of sensor locations (e.g., scalar data points, multidimensional data points, etc.) from which measurements were taken. Sensor locations can be determined to lie within clusters according to the embodiments described herein; for example, the sensor locations in a cluster may have values within a threshold distance of each other or of the cluster centroid. When viewed in sensor space, a cluster appears as a group of sensor locations that are close together. A cluster of sensor locations can correspond, for example, to a room of the house or to a specific area of the house (e.g., a hallway or front-door area).
Locations within a home or building are sometimes called “microlocations” because they refer to specific areas within, for example, a user’s home. A location or microlocation is also sometimes referred to as a cluster of locations; the terms location, microlocation, and cluster may all refer to the same area or region. A home may have multiple locations, corresponding to rooms or other areas of the house, such as a backyard area, entryway area, or hallway area.
Apple patent FIG. 1 shows a block diagram of a system for identifying a user’s application based on sensor location; FIG. 5 shows an example of microlocation using unsupervised machine learning.
A further Apple patent figure (below) shows a simplified block diagram of a semi-supervised machine learning model; FIG. 8 shows an example of the results produced by that model.
Apple patent FIG. 10B above shows a simplified block diagram of an application prediction system that includes an application-specific microlocation machine learning model.
According to some embodiments, an application can automatically generate tagged samples without an explicit user request. For example, when a user opens the front door using the Home app on their mobile device while standing in the driveway (assuming the front door is equipped with a smart lock), the Home app can measure the signal values designating that location and automatically generate a tagged sample labeled “entrance”.
Then, after the machine learning model has been trained, the Home application can either offer “open the front door” as a recommendation in the user interface, or open the front door automatically, when the model predicts that the user is in the driveway (for example, when the model determines that a new data point is “similar” to the cluster of data points associated with the driveway).
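The label-then-predict flow described above can be sketched as a nearest-centroid classifier trained on auto-generated tagged samples. This is a simplified stand-in for the patent's semi-supervised model; the sample vectors, labels, and action strings are all invented for illustration.

```python
import math
from collections import defaultdict

# Hypothetical tagged samples auto-generated when the user performed an
# action: (RSSI vector, microlocation label).
tagged = [
    ([-45.0, -80.0], "driveway"),
    ([-47.0, -78.0], "driveway"),
    ([-70.0, -40.0], "living_room"),
    ([-68.0, -42.0], "living_room"),
]
# Assumed mapping from microlocation to the action to recommend there.
actions = {"driveway": "open the front door", "living_room": "turn on the TV"}

def train(samples):
    """Nearest-centroid model: one centroid per labeled microlocation."""
    groups = defaultdict(list)
    for vec, label in samples:
        groups[label].append(vec)
    return {label: [sum(col) / len(col) for col in zip(*vecs)]
            for label, vecs in groups.items()}

def recommend(model, vec):
    """Predict the microlocation of a new scan and suggest its action."""
    label = min(model, key=lambda lb: math.dist(vec, model[lb]))
    return actions[label]

model = train(tagged)
print(recommend(model, [-46.0, -79.0]))  # "open the front door"
```

A genuinely semi-supervised version would also fold in untagged scans (e.g., via self-training, as the patent's topic list below mentions), but the core loop is the same: new scan in, nearest learned cluster out, action recommended or automated.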
As another example, a wireless streaming application can use a semi-supervised machine learning model to predict the target device to which video or audio should be cast. After the model is trained, the wireless streaming application can offer the living room TV as a recommendation when the model predicts that the user is in the living room.
Apple’s patents cover the following topics:
- Sensor measurements and clusters
- Predicting user interactions with devices
- Training and generation of clusters
- Taking proactive actions based on measured sensor locations and clusters
- Events that trigger predictions
- Event detection
- User interaction events
- Device connection events
- Determining trigger events
- Identifying applications and taking associated actions
- Contextual information
- Prediction module for determining recommendations
- Controller update to determine recommendations
- Target prediction based on microlocation using unsupervised machine learning
- Possible problems with microlocation using unsupervised machine learning
- Semi-supervised machine learning for microlocations
- Using both tagged and untagged samples
- A self-training approach for semi-supervised machine learning
- Labeling of tagged samples
- Generating tagged samples
- Application-specific models
- Application-specific models containing samples tagged with target regions
- Application-specific models containing action-tagged samples
- Target prediction system for multiple applications
- Fusion of different sensors
- How to predict target objects for mobile devices
- Location-based prediction models
- Action-based models
Engineers and hobbyists interested in learning more about this invention should check out Apple’s patent application number 20230179671.
Apple Inventors
- Yoav Feinmesser: Wireless Sensing and Location Engineer (Apple Israel)
- Rafi Vitory: Engineering Manager, Location Technology (Apple Israel)
- Ron Eyal: Machine Learning Algorithm Engineer (Apple Israel)
- Eyal Waserman: Sensing Algorithm Engineer (Apple, Herzliya District, Israel)
- Yunxing Ye, Ph.D.: Software Engineer
