Human Generated Data

Title

Untitled (young people dancing)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21581

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.8
Human 99.8
Person 99.6
Person 99.2
Person 99.2
Leisure Activities 98.7
Dance Pose 98.7
Person 97.5
Shoe 97.5
Footwear 97.5
Clothing 97.5
Apparel 97.5
Floor 97.3
Person 97.3
Shoe 91.1
Person 87.1
Person 87.1
Dress 84.7
Flooring 81.9
Female 80.8
Road 77
Person 76.8
Tarmac 76.6
Asphalt 76.6
Indoors 70
Stage 69
Photo 68.2
Portrait 68.2
Face 68.2
Photography 68.2
Dance 66.1
People 64
Woman 63.7
Shorts 63
Girl 62
Room 61.3
Food 59.6
Meal 59.6
Path 58.3
Home Decor 57.7
Crowd 55.5

Imagga
created on 2022-03-05

person 35.4
dancer 34.9
business 34
man 31
businessman 30.9
people 30.7
performer 29.7
teacher 28.6
professional 27.3
adult 26
corporate 24
group 23.4
office 22.9
male 22.7
meeting 20.7
men 20.6
entertainer 20.3
team 18.8
black 17.5
educator 17.3
success 16.9
executive 16.4
communication 15.9
businesswoman 15.4
sitting 13.7
women 13.4
work 13.3
modern 13.3
happy 13.2
teamwork 13
suit 12.9
room 12.7
businesspeople 12.3
manager 12.1
training 12
board 11.7
conference 11.7
job 11.5
successful 11
lifestyle 10.8
city 10.8
worker 10.7
spectator 10.6
career 10.4
presentation 10.2
photographer 10
attractive 9.8
working 9.7
together 9.6
building 9.1
indoor 9.1
silhouette 9.1
fashion 9
human 9
seminar 8.8
strategy 8.7
standing 8.7
education 8.7
table 8.6
screen 8.4
portrait 8.4
clothing 8.3
classroom 8.2
computer 8.2
dance 8.2
style 8.2
chair 8.1
urban 7.9
employee 7.8
laptop 7.8
company 7.4
competition 7.3
exercise 7.3
dress 7.2
looking 7.2
hall 7.2
idea 7.1
posing 7.1
interior 7.1
indoors 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

person 99.1
dance 95.3
clothing 92.4
floor 91.3
footwear 89.3
woman 77.8
black and white 75.6
group 74.3
sport 68.9
dancer 62.7
people 57.9

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Female, 87.5%
Surprised 97.6%
Calm 1.4%
Sad 0.6%
Confused 0.1%
Fear 0.1%
Disgusted 0.1%
Happy 0.1%
Angry 0%

AWS Rekognition

Age 33-41
Gender Male, 51.4%
Surprised 59.4%
Calm 27.5%
Sad 4.2%
Angry 3.4%
Disgusted 2.2%
Fear 1.3%
Confused 1%
Happy 1%

AWS Rekognition

Age 40-48
Gender Female, 83%
Happy 36.8%
Sad 36.4%
Confused 8.7%
Calm 7.6%
Disgusted 3.5%
Angry 2.6%
Fear 2.6%
Surprised 1.7%

AWS Rekognition

Age 18-26
Gender Female, 57.6%
Calm 91.5%
Sad 3.1%
Happy 3.1%
Confused 0.7%
Fear 0.6%
Angry 0.6%
Disgusted 0.3%
Surprised 0.2%

AWS Rekognition

Age 24-34
Gender Female, 68.6%
Calm 83.8%
Happy 7.1%
Fear 4.5%
Sad 2.8%
Disgusted 0.8%
Confused 0.5%
Angry 0.3%
Surprised 0.2%

Feature analysis

Amazon

Person 99.8%
Shoe 97.5%

Captions

Microsoft

a group of people standing in front of a crowd 92.4%
a group of people in a room 92.3%
a group of people standing in a room 92.2%

Text analysis

Amazon

MJI7-
MJI7- YT37A'S
YT37A'S