Human Generated Data

Title

Untitled (graduation class, Immaculate Seminary)

Date

1936

People

Artist: Harris & Ewing, American, 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22339

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 98.8
Human 98.8
Person 97.2
Person 97
Person 97
Person 95.2
Person 94.3
Person 94.2
Person 93
School 92.1
Person 90.3
Person 90
Indoors 89.2
Room 89.2
Classroom 89.2
Clothing 88.8
Apparel 88.8
Person 87.8
Person 84.4
Person 83
Person 82.2
Person 79.4
Person 77.8
Graduation 77.4
People 76.3
Sailor Suit 74.4
Person 68.1
Person 67
Person 64.3
Dress 64.2
Crowd 63.9
Person 61.9
Person 46.1
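
The rows above pair a label name with a 0-100 confidence score, the output shape of AWS Rekognition's DetectLabels API. A minimal sketch of how such tags could be produced via boto3; the region, file name, and confidence threshold are illustrative assumptions, not details from this record.

```python
import boto3

# Hypothetical client and image; region and file name are assumptions.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,  # the list above bottoms out near 46
)

# Each label pairs a name with a 0-100 confidence score,
# matching the "Person 98.8"-style rows above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```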

Imagga
created on 2022-03-11

crowd 29.7
silhouette 29
people 27.3
mortarboard 23.3
teamwork 22.2
outfit 21.5
clothing 21.2
man 20.9
cap 19.9
business 18.8
team 18.8
group 18.5
person 18
male 17.7
businessman 17.6
audience 16.6
work 16.5
businesswoman 16.3
headdress 14.2
presentation 13.9
occupation 13.7
job 13.3
hall 13.1
men 12.9
cheering 12.7
boss 12.4
flag 12.2
vivid 12.1
uniform 12
supporters 11.8
president 11.8
nighttime 11.7
speech 11.7
covering 11.7
stadium 11.7
leader 11.6
patriotic 11.5
classroom 11.5
room 11.5
vibrant 11.4
nation 11.3
design 11.2
black 11.1
lights 11.1
women 11.1
consumer goods 11
travel 10.6
walking 10.4
icon 10.3
sexy 9.6
symbol 9.4
military uniform 9.4
bright 9.3
spectator 9.2
academic gown 9.2
adult 8.7
gown 7.5
friendship 7.5
city 7.5
document 7.4
student 7.4
light 7.3
success 7.2
world 7.2
rural 7
together 7
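
Tags in this shape match Imagga's public v2 tagging endpoint, which scores tags on a 0-100 scale. A minimal sketch using hypothetical credentials and a hosted image URL; treat the exact endpoint and response layout as an assumption rather than something confirmed by this record.

```python
import requests

# Hypothetical credentials and image URL.
API_KEY, API_SECRET = "key", "secret"
IMAGE_URL = "https://example.com/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Tags arrive as {"confidence": ..., "tag": {"en": ...}} records,
# matching the "crowd 29.7"-style rows above.
for t in resp.json()["result"]["tags"]:
    print(f'{t["tag"]["en"]} {t["confidence"]:.1f}')
```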

Google
created on 2022-03-11

Crew 73.5
Event 73.3
Team 73.3
Monochrome 71.5
Monochrome photography 69.6
Art 66.9
Crowd 66.2
Uniform 65.4
History 64.6
Font 63
Troop 59.2
Military organization 54.9
Room 52.7
Photography 52.1
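
Labels like these can come from Google Cloud Vision label detection, which scores each label on a 0-1 scale. A minimal sketch with the google-cloud-vision client; the local file name is a placeholder, and scaling to 0-100 is an assumption made to match the rows above.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1 floats; scaling by 100 yields the
# "Crew 73.5"-style rows above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")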

Microsoft
created on 2022-03-11

person 95.5
clothing 93.3
outdoor 92.4
group 81.5
christmas tree 77.2
white 75.7
black and white 72.6
text 70.7
black 68.3
people 61.6
old 56.2
clothes 15
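
The Microsoft tags match the output shape of Azure Computer Vision's Tag operation. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint and key are placeholders, and the 0-100 scaling is an assumption made to match the rows above.

```python
from msrest.authentication import CognitiveServicesCredentials
from azure.cognitiveservices.vision.computervision import ComputerVisionClient

# Hypothetical endpoint and key.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<key>"),
)

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidence is a 0-1 float; scaled here to match the rows above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```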

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 95.2%
Calm 42.8%
Sad 41.9%
Happy 8%
Surprised 2.2%
Confused 1.9%
Fear 1.5%
Disgusted 1.2%
Angry 0.5%

AWS Rekognition

Age 25-35
Gender Male, 77.7%
Calm 54.4%
Happy 19.5%
Sad 18.9%
Surprised 3.5%
Fear 1.5%
Confused 1%
Disgusted 0.8%
Angry 0.4%

AWS Rekognition

Age 39-47
Gender Male, 79.3%
Calm 95.4%
Surprised 1.5%
Sad 1.2%
Happy 1%
Angry 0.3%
Confused 0.2%
Disgusted 0.2%
Fear 0.2%

AWS Rekognition

Age 29-39
Gender Male, 92.2%
Calm 97%
Surprised 2.2%
Sad 0.3%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%
Confused 0.1%
Angry 0.1%

AWS Rekognition

Age 26-36
Gender Male, 100%
Calm 95.1%
Sad 1.6%
Happy 1%
Disgusted 1%
Confused 0.8%
Surprised 0.3%
Angry 0.2%
Fear 0.2%

AWS Rekognition

Age 27-37
Gender Male, 92.7%
Calm 96.8%
Disgusted 1.9%
Happy 0.6%
Sad 0.4%
Surprised 0.1%
Confused 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 27-37
Gender Male, 98.3%
Calm 69.2%
Happy 16%
Sad 8.6%
Fear 2.8%
Disgusted 1.2%
Confused 0.8%
Surprised 0.8%
Angry 0.7%

AWS Rekognition

Age 31-41
Gender Female, 52.5%
Happy 87.3%
Calm 8.7%
Sad 1.4%
Surprised 1.3%
Fear 0.4%
Confused 0.3%
Disgusted 0.3%
Angry 0.2%

AWS Rekognition

Age 39-47
Gender Male, 97.5%
Calm 99.2%
Sad 0.4%
Fear 0.2%
Happy 0.1%
Disgusted 0%
Confused 0%
Surprised 0%
Angry 0%

AWS Rekognition

Age 35-43
Gender Male, 97.6%
Calm 75.3%
Happy 23.1%
Sad 0.5%
Surprised 0.3%
Fear 0.3%
Disgusted 0.2%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 78.1%
Surprised 6.1%
Sad 6.1%
Fear 5%
Disgusted 1.6%
Angry 1.3%
Confused 0.9%
Happy 0.9%

AWS Rekognition

Age 27-37
Gender Male, 92.5%
Calm 96.2%
Fear 0.9%
Confused 0.9%
Happy 0.8%
Sad 0.4%
Disgusted 0.4%
Surprised 0.3%
Angry 0.1%

AWS Rekognition

Age 23-33
Gender Male, 99.5%
Calm 88%
Happy 5.8%
Disgusted 2%
Sad 1.8%
Angry 1.7%
Surprised 0.4%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 34-42
Gender Male, 55.7%
Happy 48.3%
Calm 36.2%
Sad 9.1%
Surprised 1.9%
Disgusted 1.9%
Angry 1.4%
Fear 0.7%
Confused 0.6%

AWS Rekognition

Age 24-34
Gender Female, 60.9%
Calm 98.8%
Happy 0.4%
Surprised 0.3%
Sad 0.2%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 41-49
Gender Male, 84.4%
Calm 68.8%
Happy 23.9%
Disgusted 2.2%
Surprised 1.6%
Sad 1%
Confused 0.9%
Fear 0.9%
Angry 0.8%

AWS Rekognition

Age 24-34
Gender Male, 83.1%
Calm 72%
Happy 13.3%
Sad 7.1%
Confused 4.3%
Disgusted 1%
Fear 0.9%
Surprised 0.9%
Angry 0.6%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 58.1%
Sad 19%
Happy 8.9%
Fear 5%
Confused 3.3%
Surprised 3%
Disgusted 1.6%
Angry 1%

AWS Rekognition

Age 37-45
Gender Female, 51.7%
Happy 95.2%
Calm 3.3%
Surprised 0.7%
Sad 0.2%
Confused 0.2%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
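
Each AWS Rekognition block above (an age range, a gender with confidence, and a ranked list of emotions) corresponds to one FaceDetail from the DetectFaces API. A minimal sketch via boto3; the region and file name are assumptions.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

# One FaceDetail per detected face; fields map onto the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```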

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
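
The Google Vision blocks report face-detection likelihood enums (VERY_UNLIKELY through VERY_LIKELY), rendered above as "Very unlikely", "Unlikely", "Possible", and so on, one block per detected face. A minimal sketch with google-cloud-vision; the file name is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each annotation exposes per-attribute likelihood enums,
# matching the six rows in each block above.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```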

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a group of people standing in front of a crowd 87.8%
a group of people riding on the back of a horse 81.9%
a group of people standing in front of a building 81.8%
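
Ranked candidate captions with confidences like these match Azure Computer Vision's Describe operation, which can return several alternatives per image. A minimal sketch; the endpoint, key, and max_candidates value are assumptions.

```python
from msrest.authentication import CognitiveServicesCredentials
from azure.cognitiveservices.vision.computervision import ComputerVisionClient

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",  # hypothetical endpoint
    CognitiveServicesCredentials("<key>"),
)

with open("photo.jpg", "rb") as f:
    result = client.describe_image_in_stream(f, max_candidates=3)

# Each candidate caption carries a 0-1 confidence, matching the rows above.
for caption in result.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}")
```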

Text analysis

Amazon

93710
A
Immaculata
A 93710 Immaculata Serinery 6.36
Serinery 6.36
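
The Amazon text results mix word-level and line-level detections from the same pass, which is why "Immaculata" and "Serinery 6.36" appear both on their own and inside the full line ("Serinery" is most likely the OCR engine's misread of "Seminary" in the photograph). A minimal sketch of the Rekognition DetectText call via boto3; the region and file name are assumptions.

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections for the same text,
# which is why single words above also reappear inside the full line.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```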