Human Generated Data

Title

Untitled (women sewing in factory at Delta Manufacturing Company)

Date

c. 1940

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2281

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2022-01-30

Person 99.7
Human 99.7
Person 99.6
Person 99
Person 95.9
Kid 94.1
Child 94.1
Female 94.1
Girl 94.1
Blonde 94.1
Teen 94.1
Woman 94.1
Clothing 91.6
Apparel 91.6
Meal 79.2
Food 79.2
Face 77.2
Coat 73.3
Overcoat 73.3
Suit 73.3
Sitting 71.3
People 69.9
Dish 65.6
Indoors 61.9
Table 59.4
Furniture 59.4
Text 58.4
Room 57.8
Sleeve 56.9
Person 46.6
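
The Amazon tags above follow the label/confidence format returned by the AWS Rekognition DetectLabels operation. The exact pipeline that produced this record is not documented on this page; the following is a minimal sketch, assuming boto3 with configured AWS credentials and a local copy of the image under a placeholder filename.

# Minimal sketch: Rekognition-style object and scene tags for a local image.
# The filename is a placeholder, not part of the museum record.
import boto3

def rekognition_labels(image_path, min_confidence=40.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Each label carries a Name and a 0-100 Confidence score,
    # matching entries such as "Person 99.7" in the list above.
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in rekognition_labels("untitled_delta_manufacturing.jpg"):
        print(name, confidence)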

Imagga
created on 2022-01-30

laptop 36.5
man 34.3
grand piano 33.8
computer 33.7
people 32.3
adult 31.3
business 29.1
office 29
piano 27.4
happy 26.3
male 25.5
work 25.1
person 24
working 23
indoors 22
stringed instrument 21.3
keyboard instrument 20.6
percussion instrument 20.3
sitting 19.7
smiling 19.5
technology 19.3
home 18.3
worker 17.9
professional 17.9
musical instrument 17.4
women 17.4
businesswoman 17.3
notebook 17
lifestyle 16.6
job 15.9
table 15.8
smile 15.7
corporate 15.5
men 15.4
modern 14.7
desk 14.3
portrait 14.2
businessman 14.1
indoor 13.7
casual 13.5
groom 13.5
suit 13.5
communication 13.4
couple 13.1
phone 12.9
room 12.7
cheerful 12.2
education 12.1
happiness 11.7
looking 11.2
teacher 11.2
love 11
two 11
successful 11
salon 10.5
together 10.5
fun 10.5
success 10.5
businesspeople 10.4
sit 10.4
meeting 10.4
team 9.8
attractive 9.8
gondola 9.8
interior 9.7
senior 9.4
manager 9.3
pretty 9.1
lady 8.9
cleaner 8.8
formal 8.6
executive 8.6
talking 8.5
face 8.5
keyboard 8.4
mature 8.4
house 8.4
student 8.3
building 8.3
one 8.2
handsome 8
boy 7.8
class 7.7
telephone 7.7
old 7.7
elderly 7.7
workplace 7.6
friends 7.5
monitor 7.5
leisure 7.5
boat 7.4
emotion 7.4
alone 7.3
group 7.3
hair 7.1
clothing 7.1
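
The Imagga tags use the same tag/confidence pattern. Below is a minimal sketch of how such tags could be requested from the Imagga v2 tagging endpoint; the API key, secret, and image URL are placeholders, and the parameters actually used for this record are not documented here.

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder credential
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder credential

def imagga_tags(image_url):
    # The v2 tagging endpoint takes an image URL and returns English tags
    # with 0-100 confidence scores, as in the "laptop 36.5" style list above.
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    response.raise_for_status()
    return [(t["tag"]["en"], round(t["confidence"], 1))
            for t in response.json()["result"]["tags"]]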

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

black and white 91.9
clothing 90.7
person 87.5
woman 86.9
text 76.7
man 50.4
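
The Microsoft tags resemble the output of the Azure Computer Vision Analyze Image operation with the Tags visual feature. A minimal sketch follows, assuming a v3.2 endpoint, a subscription key, and an image URL, all placeholders; note that the service reports confidences on a 0-1 scale, shown on this page as percentages.

import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                     # placeholder

def azure_tags(image_url):
    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    # Convert the 0-1 confidences to percentages to match the list above.
    return [(t["name"], round(t["confidence"] * 100, 1))
            for t in response.json()["tags"]]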

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 97.8%
Calm 98.2%
Confused 0.6%
Surprised 0.3%
Sad 0.3%
Disgusted 0.2%
Happy 0.2%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 99.5%
Happy 70.4%
Angry 15.8%
Sad 4.3%
Disgusted 3.1%
Calm 2.1%
Surprised 2.1%
Confused 1.7%
Fear 0.5%
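
The two AWS Rekognition face records above (age range, gender, and an emotion distribution) correspond to the fields returned by the DetectFaces operation when all attributes are requested. A minimal sketch, again assuming boto3 and a placeholder filename:

import boto3

def rekognition_faces(image_path):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # include age range, gender, and emotions
        )
    faces = []
    for detail in response["FaceDetails"]:
        faces.append({
            "age": (detail["AgeRange"]["Low"], detail["AgeRange"]["High"]),
            "gender": (detail["Gender"]["Value"],
                       round(detail["Gender"]["Confidence"], 1)),
            # Emotions come back as (Type, Confidence) pairs,
            # e.g. Calm 98.2%, Confused 0.6%, as in the records above.
            "emotions": sorted(
                ((e["Type"], round(e["Confidence"], 1)) for e in detail["Emotions"]),
                key=lambda pair: pair[1],
                reverse=True,
            ),
        })
    return faces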

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
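
The Google Vision face records report likelihood buckets (Very unlikely through Very likely) rather than numeric scores; these map to the likelihood enums returned by Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision client library, application default credentials, and a placeholder filename:

from google.cloud import vision

def vision_face_likelihoods(image_path):
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    results = []
    for face in response.face_annotations:
        # Each field is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY),
        # matching the "Joy Very unlikely" style rows above.
        results.append({
            "surprise": vision.Likelihood(face.surprise_likelihood).name,
            "anger": vision.Likelihood(face.anger_likelihood).name,
            "sorrow": vision.Likelihood(face.sorrow_likelihood).name,
            "joy": vision.Likelihood(face.joy_likelihood).name,
            "headwear": vision.Likelihood(face.headwear_likelihood).name,
            "blurred": vision.Likelihood(face.blurred_likelihood).name,
        })
    return results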

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people sitting at a table 78%
a group of people sitting around a table 76.9%
a group of people standing around a table 76.8%
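
The Microsoft captions, with their ranked confidence scores, match the shape of the Azure Computer Vision Describe Image operation. A minimal sketch, assuming the v3.2 Describe operation and reusing the placeholder endpoint and key from the tags example above:

import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                     # placeholder

def azure_captions(image_url, max_candidates=3):
    response = requests.post(
        f"{AZURE_ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": max_candidates},
        headers={"Ocp-Apim-Subscription-Key": AZURE_KEY},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    # Captions arrive with 0-1 confidences; convert to percentages,
    # as in "a group of people sitting at a table 78%" above.
    return [(c["text"], round(c["confidence"] * 100, 1))
            for c in response.json()["description"]["captions"]]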