Human Generated Data

Title

Untitled (group portrait of female yearbook staff seated around table with typewriter while looking at yearbooks)

Date

1952-1953

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6697

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.3
Person 99.2
Person 98.9
Person 98.6
Person 98.1
Person 98
Chair 97.1
Furniture 97.1
Sitting 93.5
Person 90.8
Clothing 79.8
Apparel 79.8
Restaurant 78.3
Table 77.1
People 76
Female 72.5
Cafeteria 72.5
Face 71.5
Meal 69.3
Food 69.3
Person 69.1
Sleeve 63.1
Couch 59.3
Suit 58.8
Coat 58.8
Overcoat 58.8
Girl 57.4
Woman 57.3
Bowl 56.4
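
The name/score pairs above match the structure returned by AWS Rekognition's DetectLabels operation (a label name plus a confidence percentage). A minimal sketch, assuming boto3 credentials are configured and using a placeholder file name and an assumed MinConfidence cutoff, of how such a tag list could be generated:

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder path; a local copy of the photograph is assumed.
with open("yearbook_staff.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff; the list above bottoms out near 56
)

# Each label carries a Name and a Confidence score, as in the rows above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```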

Clarifai
created on 2023-10-26

people 99.7
woman 99.1
group 99
adult 98.4
group together 98.1
man 97.3
sit 96.5
furniture 95.1
indoors 94.1
room 94
table 93.5
four 93.4
three 90
child 87.7
meeting 85.7
family 83.5
actor 83.3
adolescent 82.4
desk 80.7
sitting 78.3

Imagga
created on 2022-01-22

man 39.6
people 35.7
classroom 31.6
male 31.2
person 30.9
room 30.1
business 28.5
group 26.6
businessman 25.6
meeting 25.4
table 25.2
happy 25
adult 24.7
smiling 24.6
office 22.7
suit 21.7
team 21.5
corporate 21.5
indoors 21.1
businesswoman 20.9
women 20.5
cheerful 19.5
couple 19.2
work 18
professional 17.5
businesspeople 17.1
lifestyle 16.6
education 16.4
smile 16.4
sitting 16.3
men 16.3
happiness 15.7
attractive 15.4
desk 15.2
home 15.1
executive 14.9
interior 14.1
holding 14
teacher 13.9
teamwork 13.9
laptop 13.8
musical instrument 13.7
worker 13.3
working 13.2
together 13.1
student 13
coffee 13
indoor 12.8
kitchen 12.5
workplace 12.4
color 12.2
pretty 11.9
communication 11.7
job 11.5
looking 11.2
presentation 11.2
love 11
two 11
portrait 11
dress 10.8
conference 10.7
modern 10.5
enjoyment 10.3
20s 10.1
board 9.9
restaurant 9.8
to 9.7
computer 9.7
friends 9.4
bartender 9.4
study 9.3
casual 9.3
leisure 9.1
confident 9.1
clothing 9.1
stringed instrument 8.9
discussion 8.8
colleagues 8.7
standing 8.7
busy 8.7
food 8.6
face 8.5
drink 8.3
wine 8.3
occupation 8.2
fun 8.2
handsome 8
outfit 7.9
glass 7.8
exam 7.7
drinking 7.6
husband 7.6
notebook 7.6
adults 7.6
togetherness 7.5
life 7.5
horizontal 7.5
document 7.4
technology 7.4
service 7.4
training 7.4
inside 7.4
cup 7.2
black 7.2
school 7.2

Google
created on 2022-01-22

Table 95.9
Furniture 93.9
Black 89.6
Chair 84.3
Style 83.9
Sharing 81.5
Desk 79.8
Adaptation 79.2
Curtain 75
Snapshot 74.3
Room 74.2
Vintage clothing 73.7
Art 71.5
Event 71.2
Writing desk 68.6
Font 68.3
Monochrome 67.4
Stock photography 65.3
Rectangle 63.3
Illustration 62.9
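
The Google tags above resemble the label annotations returned by the Cloud Vision API, which reports a description and a 0-1 score per label. A minimal sketch, assuming the google-cloud-vision Python client and a placeholder file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder path; a local copy of the photograph is assumed.
with open("yearbook_staff.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Each annotation has a description and a 0-1 score; the list above shows the score as a percentage.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```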

Microsoft
created on 2022-01-22

person 99.8
wall 96.9
text 96.1
clothing 94.5
indoor 93.6
man 93.3
table 93.1
human face 88.6
people 84.7
smile 77
group 76.2
dish 42.5

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 98.7%
Confused 30.6%
Angry 18.8%
Surprised 18.6%
Disgusted 12.1%
Sad 9.4%
Calm 6.8%
Fear 2.7%
Happy 1%

AWS Rekognition

Age 22-30
Gender Female, 99.8%
Calm 90.9%
Sad 7.8%
Angry 0.3%
Disgusted 0.3%
Confused 0.2%
Fear 0.2%
Surprised 0.1%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Male, 99.1%
Happy 89.8%
Confused 3.9%
Surprised 1.6%
Fear 1.3%
Sad 1.3%
Disgusted 0.8%
Angry 0.8%
Calm 0.5%

AWS Rekognition

Age 24-34
Gender Female, 99.5%
Sad 39.2%
Confused 35.1%
Angry 8.3%
Surprised 4.9%
Happy 4.8%
Disgusted 3.2%
Calm 3%
Fear 1.3%

AWS Rekognition

Age 24-34
Gender Female, 100%
Happy 98.7%
Surprised 0.5%
Fear 0.2%
Confused 0.1%
Angry 0.1%
Sad 0.1%
Disgusted 0.1%
Calm 0.1%

AWS Rekognition

Age 23-31
Gender Female, 99.9%
Calm 99.8%
Sad 0.1%
Happy 0%
Fear 0%
Angry 0%
Surprised 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 24-34
Gender Female, 99.6%
Happy 70.5%
Calm 21.1%
Surprised 2.2%
Sad 2.1%
Confused 1.9%
Angry 0.9%
Disgusted 0.6%
Fear 0.5%

AWS Rekognition

Age 20-28
Gender Female, 99.8%
Happy 69.9%
Calm 9%
Surprised 7.7%
Confused 3.6%
Fear 3.6%
Angry 2.9%
Sad 2.1%
Disgusted 1.2%

AWS Rekognition

Age 56-64
Gender Female, 99.9%
Calm 42.8%
Happy 19.3%
Confused 9.1%
Sad 8.9%
Disgusted 6.3%
Angry 5.9%
Surprised 5%
Fear 2.7%
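
The per-face blocks above (an age range, a gender estimate, and a confidence per emotion) mirror the FaceDetails structure returned by AWS Rekognition's DetectFaces operation. A minimal sketch, assuming boto3 and a placeholder file name, of how these values could be read back:

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder path; a local copy of the photograph is assumed.
with open("yearbook_staff.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]       # {"Low": ..., "High": ...}
    gender = face["Gender"]      # {"Value": "Female", "Confidence": ...}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:  # one confidence per emotion, as in each block above
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```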

Microsoft Cognitive Services

Age 50
Gender Female

Microsoft Cognitive Services

Age 39
Gender Female

Microsoft Cognitive Services

Age 38
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 27
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
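
Unlike the numeric scores above, the Google Vision rows report bucketed likelihoods (VERY_UNLIKELY through VERY_LIKELY) per detected face. A minimal sketch, assuming the google-cloud-vision Python client and a placeholder file name, of how these ratings could be obtained:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder path; a local copy of the photograph is assumed.
with open("yearbook_staff.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries bucketed likelihood fields rather than numeric confidences.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```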

Feature analysis

Amazon

Person 99.7%
Chair 97.1%

Categories

Imagga

people portraits 96.6%
events parties 1.4%