Human Generated Data

Title

Untitled (woman taking corsage from girl during masonic ceremony with audience watching)

Date

1951

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9360

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 95.5
Human 95.5
Person 95
Indoors 94.8
Room 94.1
Person 93.2
Shop 92
Person 92
Person 88.3
Person 83.8
Boutique 78.8
Person 77.2
Dressing Room 76.6
Person 67.6
Window Display 66.2
Mannequin 57.9

Imagga
created on 2022-01-23

man 26.9
people 25.1
business 23.1
male 20.6
adult 20.1
musical instrument 19.1
blackboard 19.1
person 18.8
businessman 18.5
wind instrument 17.8
case 15.9
corporate 13.7
sax 13.7
professional 13.6
dress 13.5
happy 13.2
office 12.8
brass 12.7
love 12.6
black 12.6
fashion 12.1
men 12
women 11.9
work 11.8
happiness 11.7
executive 11.6
job 11.5
couple 11.3
businesswoman 10.9
clothing 10.9
studio 10.6
room 10.6
interior 10.6
modern 10.5
style 10.4
two 10.2
outfit 10.1
silhouette 9.9
suit 9.9
bride 9.6
boss 9.6
device 9.5
stage 9.2
musician 9.1
stringed instrument 9
cornet 8.9
harp 8.9
success 8.8
working 8.8
indoors 8.8
lifestyle 8.7
musical 8.6
groom 8.6
youth 8.5
design 8.4
guitar 8.4
portrait 8.4
hand 8.3
music 8.3
bass 8.3
wedding 8.3
indoor 8.2
team 8.1
handsome 8
computer 8
teacher 8
smiling 8
model 7.8
play 7.7
sitting 7.7
communication 7.6
clothes 7.5
fun 7.5
life 7.4
symbol 7.4
lady 7.3
group 7.3
celebration 7.2
smile 7.1
support 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99
dress 87.1
cartoon 74.9
clothing 71.2
black and white 67.6

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 95.4%
Calm 99.4%
Happy 0.4%
Surprised 0.1%
Sad 0.1%
Fear 0%
Disgusted 0%
Angry 0%
Confused 0%

AWS Rekognition

Age 35-43
Gender Female, 94.2%
Calm 92.9%
Sad 3.3%
Happy 2%
Surprised 0.7%
Confused 0.4%
Fear 0.3%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 18-24
Gender Female, 88.6%
Calm 64.4%
Sad 21.6%
Confused 4.6%
Angry 3.2%
Surprised 1.9%
Happy 1.6%
Disgusted 1.4%
Fear 1.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.5%

Captions

Microsoft

a group of people around each other 66.7%
a group of people standing next to a window 55.5%
a group of people in a room 55.4%

Text analysis

Amazon

e
KODVK-EVEELA
# 3 3 8 2

Google

te 3 8 2
3
2
8
te