Human Generated Data

Title

Untitled (young woman giving roses to two others in hall at party)

Date

1959

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9665

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Human 99.4
Person 99.4
Person 99.3
Person 99.2
Person 99.2
Apparel 98.6
Clothing 98.6
Person 98.2
Footwear 97.5
Shoe 97.5
Person 91.9
Female 83.5
Indoors 74.1
Room 72.9
Home Decor 69.5
People 68.2
Skirt 66.2
Woman 65.8
Shorts 62.6
Furniture 59.4
Girl 59.1
Costume 56.8
Photography 56.6
Photo 56.6

Imagga
created on 2022-01-23

brass 79.4
wind instrument 66.1
musical instrument 44
cornet 36.5
man 28.2
people 24.5
person 21.3
adult 21.2
male 18.4
couple 15.7
business 15.2
sax 13.5
black 13.2
kin 13.2
teacher 12.9
room 12.6
businessman 12.4
portrait 12.3
men 12
happy 11.9
women 11.9
silhouette 11.6
professional 11.5
dark 10.8
interior 10.6
boy 10.4
two 10.2
fashion 9.8
family 9.8
to 9.7
group 9.7
indoors 9.7
style 9.6
office 9.6
home 9.6
sitting 9.4
happiness 9.4
youth 9.4
horn 9.3
device 9.3
performer 9.2
indoor 9.1
girls 9.1
modern 9.1
dress 9
love 8.7
bride 8.6
pretty 8.4
music 8.1
active 8.1
sexy 8
formal 7.6
sport 7.5
human 7.5
friendship 7.5
fun 7.5
educator 7.5
holding 7.4
wedding 7.4
lady 7.3
dancer 7.2
suit 7.2
team 7.2
musician 7.2
job 7.1
together 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

clothing 96.6
text 95.7
person 93.3
woman 91.7
dress 91.3
footwear 89.9
black and white 83.8

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 98.8%
Calm 94.9%
Sad 1.6%
Happy 1.5%
Fear 1%
Angry 0.3%
Surprised 0.3%
Disgusted 0.2%
Confused 0.2%

Google Vision (three faces detected, identical results for each)

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Shoe 97.5%
Skirt 66.2%

Captions

Microsoft

a group of people standing in front of a wall 78.7%
a group of people standing next to a person 64%
a group of people standing in front of a sign 63.9%

Text analysis

Amazon

p31ee

Google

2 3 J
3
2
J