Human Generated Data

Title

Untitled (girls racing on hobby horses at a Ball)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5665

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.5
Person 99.5
Person 99.5
Person 99.2
Person 98.7
Person 96.8
Clothing 91.5
Apparel 91.5
Shoe 86.9
Footwear 86.9
Indoors 84.2
Interior Design 84.2
Person 78.6
Leisure Activities 78.4
Dance Pose 78.4
Person 77.7
Crowd 76.4
People 71.7
Person 67.3
Text 63.6
Food 63.5
Meal 63.5
Pants 61.4
Person 60.1
Female 59.9
Floor 59.3
Suit 58
Coat 58
Overcoat 58
Theme Park 57.6
Amusement Park 57.6
Room 56.6
Tuxedo 56.3
Carousel 55.3
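
The Amazon tag list above has the shape of output from AWS Rekognition's DetectLabels operation. The following is a minimal sketch of such a call using boto3; the image filename and the 55% confidence floor are assumptions for illustration, not details taken from this record.

```python
import boto3

IMAGE_PATH = "steinmetz_13867.jpg"  # placeholder filename, not part of this record

rekognition = boto3.client("rekognition")  # relies on locally configured AWS credentials

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=50,
        MinConfidence=55,  # assumed cutoff; the list above bottoms out near 55%
    )

# Print "Name Confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```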

Imagga
created on 2021-12-15

silhouette 29.8
people 22.9
man 22.8
person 17.2
ice 16
grunge 15.3
facility 14.7
business 14.6
sport 14.5
black 14.4
gymnasium 13.8
men 13.7
group 13.7
drawing 13.6
city 13.3
play 12.9
male 12.8
party 12
art 12
team 11.6
businessman 11.5
symbol 11.4
urban 11.4
design 11.2
fun 11.2
active 10.8
game 10.7
silhouettes 10.7
crowd 10.6
reflection 10.5
athletic facility 10.3
dance 9.8
sketch 9.8
human 9.7
women 9.5
motion 9.4
adult 9.3
figure 9.2
fashion 9
player 8.8
boy 8.7
move 8.6
youth 8.5
poster 8.5
speed 8.2
competition 8.2
suit 8.1
activity 8.1
shadow 8.1
rush 7.9
portrait 7.8
travel 7.7
outdoor 7.6
creation 7.6
outline 7.6
power 7.6
speedway 7.5
style 7.4
action 7.4
paint 7.2
ball 7.2
celebration 7.2
transportation 7.2
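
The Imagga tags above could come from Imagga's v2 tagging endpoint. The sketch below is a hedged illustration of such a request; the API key, secret, and image URL are placeholders, and this is not necessarily the pipeline used to produce this record.

```python
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credential
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder credential
IMAGE_URL = "https://example.org/steinmetz_13867.jpg"  # placeholder image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # Imagga uses HTTP Basic auth with key/secret
    timeout=30,
)
response.raise_for_status()

# Each result carries an English tag plus a confidence score, as listed above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```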

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99
footwear 94.3
person 74.8
group 56.6
ice skating 55

Face analysis

Amazon

AWS Rekognition

Age 48-66
Gender Female, 53.5%
Angry 33.9%
Calm 17.2%
Sad 12.7%
Confused 12.4%
Disgusted 8.3%
Happy 6.3%
Surprised 4.8%
Fear 4.4%

AWS Rekognition

Age 36-54
Gender Female, 68.8%
Calm 46%
Happy 30.5%
Sad 16.5%
Confused 2.5%
Angry 1.5%
Surprised 1.1%
Disgusted 1%
Fear 0.9%

AWS Rekognition

Age 20-32
Gender Male, 58%
Calm 48.4%
Happy 23.8%
Sad 5.9%
Fear 5.6%
Surprised 5.6%
Confused 5%
Disgusted 3.3%
Angry 2.4%

AWS Rekognition

Age 39-57
Gender Female, 53.9%
Happy 73.5%
Calm 12.5%
Sad 5.1%
Fear 3.3%
Surprised 2.1%
Angry 1.6%
Confused 1.5%
Disgusted 0.3%
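
Each "AWS Rekognition" block above (age range, gender, and emotion percentages) mirrors one FaceDetail from Rekognition's DetectFaces operation when all attributes are requested. A minimal sketch, again with an assumed local filename:

```python
import boto3

IMAGE_PATH = "steinmetz_13867.jpg"  # placeholder filename
rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with confidences; sort to match the descending lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```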

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
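
The Google Vision face blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch with the google-cloud-vision client, assuming a local copy of the image and configured Google credentials:

```python
from google.cloud import vision

IMAGE_PATH = "steinmetz_13867.jpg"  # placeholder filename

client = vision.ImageAnnotatorClient()  # relies on configured Google credentials

with open(IMAGE_PATH, "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

def bucket(value):
    # Convert the Likelihood enum into the readable form used above ("Very unlikely").
    return vision.Likelihood(value).name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))
```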

Feature analysis

Amazon

Person 99.5%
Shoe 86.9%

Captions

Microsoft

a group of people posing for a photo 75.1%
a group of people posing for the camera 75%
a group of people posing for a picture 74.9%
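
The three closely scored captions look like candidates from Azure Computer Vision's Describe Image operation. The sketch below is an assumption about how such a call might look, with the same placeholder endpoint, key, and image URL as before.

```python
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder Azure resource
SUBSCRIPTION_KEY = "YOUR_AZURE_KEY"                             # placeholder credential
IMAGE_URL = "https://example.org/steinmetz_13867.jpg"           # placeholder image URL

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},  # request several caption candidates, as shown above
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Each caption candidate has a text and a 0-1 confidence; scale to a percentage.
for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")
```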

Text analysis

Amazon

13867.

Google

13867.
NAGON-YT3RA2-AMT2A3
13867. 13867. NAGON-YT3RA2-AMT2A3
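
The detected strings above (such as "13867.") are the kind of result returned by Rekognition's DetectText and Google Vision's text detection. A sketch of the Rekognition side, using the same placeholder filename as the earlier examples:

```python
import boto3

IMAGE_PATH = "steinmetz_13867.jpg"  # placeholder filename
rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# LINE detections give whole strings such as "13867."; WORD detections repeat the pieces.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")
```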