Human Generated Data

Title

Untitled (group of children seated around tables, Methodist Church)

Date

c. 1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2762

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Chair 99.6
Furniture 99.6
Person 99.1
Human 99.1
Person 99
Person 98.9
Interior Design 98.6
Indoors 98.6
Chair 98.5
Person 96.6
Room 96.6
Person 96.2
Person 94.6
Classroom 94.3
School 94.3
Person 94.1
Person 93.2
Person 91.4
Person 88.9
Person 88.9
Crowd 87.7
Person 86
Person 85.7
Person 84
Ceiling Fan 82.5
Appliance 82.5
Table 81.1
Person 80.4
Leisure Activities 78
Musical Instrument 73.7
Musician 73.7
People 71.8
Audience 68.7
Dining Table 63.3
Person 61.3
Person 60.8
Stage 58.1
Music Band 57.2
Overcoat 55.5
Suit 55.5
Coat 55.5
Apparel 55.5
Clothing 55.5

Imagga
created on 2022-01-16

classroom 100
room 82.5
people 36.8
person 32.6
teacher 28
man 27.5
group 25.8
businessman 25.6
business 25.5
male 24.8
men 24
adult 23.2
professional 22.2
women 21.3
meeting 20.7
office 20
sitting 19.7
smiling 19.5
table 19
student 18.6
happy 17.5
education 17.3
modern 16.8
corporate 16.3
work 15.8
together 15.8
team 15.2
school 15
indoors 14.9
teamwork 14.8
couple 14.8
restaurant 14.7
businesswoman 14.5
interior 14.1
colleagues 13.6
chair 13.5
educator 13.5
communication 13.4
executive 13.3
job 13.3
portrait 12.9
hall 12.4
lifestyle 12.3
life 12
indoor 11.9
blackboard 11.6
talking 11.4
desk 11.3
worker 11.1
students 10.7
businesspeople 10.4
mature 10.2
smile 10
human 9.7
teaching 9.7
success 9.6
class 9.6
building 9.6
home 9.6
drinking 9.6
two 9.3
board 9
suit 9
cheerful 8.9
30s 8.6
happiness 8.6
casual 8.5
friends 8.4
design 8.4
senior 8.4
eating 8.4
manager 8.4
occupation 8.2
girls 8.2
standing 7.8
businessmen 7.8
40s 7.8
corporation 7.7
studying 7.7
gesture 7.6
finance 7.6
learning 7.5
friendship 7.5
study 7.5
presentation 7.4
holding 7.4
employee 7.4
successful 7.3
friendly 7.3
children 7.3

Google
created on 2022-01-16

Photograph 94.1
Black 89.6
Ceiling fan 88.5
Chair 86.7
Table 84.8
Black-and-white 84.4
Style 83.9
Art 77.9
Monochrome 76.6
Monochrome photography 75.5
Snapshot 74.3
Suit 73.3
Event 72.5
Mechanical fan 72.1
Building 71.1
Room 70.9
Music 69.1
Painting 68.3
Font 66
Class 64.8

Microsoft
created on 2022-01-16

person 97.8
text 89.5
clothing 86.2
group 82.9
furniture 77.7
people 70.6
table 54
posing 43.5

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 94.2%
Happy 73.6%
Calm 11.2%
Surprised 10.6%
Disgusted 1.4%
Fear 1%
Angry 0.8%
Confused 0.7%
Sad 0.6%

AWS Rekognition

Age 35-43
Gender Male, 94.7%
Calm 66.4%
Sad 18.6%
Happy 10.3%
Disgusted 1.6%
Fear 1.2%
Confused 0.8%
Surprised 0.5%
Angry 0.5%

AWS Rekognition

Age 39-47
Gender Male, 98%
Calm 87.5%
Happy 5%
Confused 1.9%
Sad 1.8%
Angry 1.7%
Disgusted 1%
Surprised 0.6%
Fear 0.5%

AWS Rekognition

Age 26-36
Gender Male, 99.9%
Calm 62.9%
Sad 31.9%
Confused 2.4%
Happy 0.9%
Disgusted 0.9%
Angry 0.7%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 48-54
Gender Male, 91.9%
Happy 91.8%
Calm 4%
Sad 2%
Confused 0.6%
Surprised 0.5%
Disgusted 0.4%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 40-48
Gender Male, 99%
Calm 97.9%
Sad 0.8%
Happy 0.3%
Disgusted 0.3%
Surprised 0.2%
Confused 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Female, 92%
Calm 92.1%
Sad 5.8%
Surprised 1.3%
Happy 0.2%
Confused 0.2%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 99%
Calm 94%
Angry 1.9%
Sad 1.5%
Disgusted 1.3%
Surprised 0.5%
Confused 0.4%
Happy 0.3%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Female, 73.8%
Calm 97.3%
Sad 1.7%
Surprised 0.3%
Confused 0.2%
Happy 0.2%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 36-44
Gender Female, 85.4%
Happy 62.4%
Calm 19.4%
Sad 7.5%
Angry 2.9%
Confused 2.4%
Fear 2.2%
Disgusted 2.1%
Surprised 1.1%

AWS Rekognition

Age 33-41
Gender Female, 98.4%
Sad 51.4%
Happy 25.3%
Calm 13.2%
Confused 3.8%
Fear 2.8%
Surprised 1.4%
Disgusted 1.2%
Angry 0.9%

AWS Rekognition

Age 27-37
Gender Male, 95.4%
Calm 99.8%
Sad 0.1%
Disgusted 0%
Surprised 0%
Angry 0%
Confused 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 14-22
Gender Female, 51%
Sad 88.1%
Happy 6.5%
Calm 3.6%
Disgusted 0.6%
Fear 0.5%
Angry 0.4%
Surprised 0.3%
Confused 0.1%

AWS Rekognition

Age 33-41
Gender Male, 85.7%
Calm 96.8%
Sad 1.4%
Happy 0.6%
Confused 0.5%
Angry 0.2%
Fear 0.2%
Surprised 0.2%
Disgusted 0.2%

AWS Rekognition

Age 18-26
Gender Female, 69.5%
Sad 93.8%
Calm 4.7%
Confused 0.5%
Fear 0.5%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Chair 99.6%
Person 99.1%
Ceiling Fan 82.5%

Captions

Microsoft

a group of people standing in front of a building 94.9%
a group of people standing in front of a window 90.5%
a group of people standing in front of a store 89.1%

Text analysis

Amazon

5
RODAK
AARAND