Human Generated Data

Title

Untitled (aboard a car)

Date

1968

People

Artist: Barbara Norfleet, American (born 1926)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1944

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.2
Person 99.2
Person 97.9
Person 95.6
Person 89.8
Person 88.4
Leisure Activities 68.7
Apparel 68.6
Clothing 68.6
Face 60.5
Performer 58.9
Crowd 58.6
Female 56.1
Girl 56
Musician 55.1
Musical Instrument 55.1

Imagga
created on 2022-01-08

man 39.6
people 32.9
male 32.5
adult 29.4
person 28.4
couple 26.1
happy 25.7
happiness 25.1
smiling 24.6
together 22.8
group 21.8
meeting 21.7
sitting 21.5
attractive 21
business 20
businesswoman 20
home 19.1
women 19
child 18.4
professional 18
team 17.9
family 17.8
businessman 17.7
love 17.4
corporate 17.2
businesspeople 17.1
office 17
teamwork 16.7
smile 16.4
patient 16.1
casual 16.1
looking 16
lifestyle 15.9
work 15.7
executive 15.7
laptop 15.5
stethoscope 15.2
mother 14.8
couch 14.5
computer 14.4
handsome 14.3
working 14.1
musical instrument 13.8
successful 13.7
portrait 13.6
two 13.6
talking 13.3
indoors 13.2
men 12.9
20s 12.8
colleagues 12.6
partner 12.6
room 12.5
job 12.4
wind instrument 12.4
cheerful 12.2
success 12.1
flute 12.1
suit 11.7
holding 11.6
adults 11.4
father 10.7
relationship 10.3
woodwind 10.1
medical instrument 10.1
daughter 10.1
confident 10
husband 9.9
worker 9.8
affectionate 9.7
girlfriend 9.6
partnership 9.6
student 9.6
boy 9.6
table 9.5
career 9.5
friends 9.4
model 9.3
sick person 9.3
indoor 9.1
case 9.1
fun 9
device 9
romance 8.9
partners 8.7
boyfriend 8.7
30s 8.7
married 8.6
loving 8.6
black 8.5
face 8.5
females 8.5
togetherness 8.5
instrument 8.5
baby 8.4
pretty 8.4
joy 8.4
coffee 8.3
fashion 8.3
playing 8.2
stringed instrument 8.1
life 8.1
romantic 8
brunette 7.8
consultant 7.8
nurse 7.7
talk 7.7
staff 7.7
laughing 7.6
communication 7.6
house 7.5
camera 7.5
manager 7.5
technology 7.4
brother 7.3
occupation 7.3
color 7.2
sexy 7.2
cute 7.2
parent 7.1

Google
created on 2022-01-08

Black-and-white 85.1
Style 84
People 78.1
Jacket 77.5
Monochrome 74.5
Monochrome photography 72.5
Happy 72.4
Sitting 67.7
Fur 63.6
Fun 61.9
Street 57.9
Child 56.5
Hoodie 54.3
Vintage clothing 53.5
Conversation 51

Microsoft
created on 2022-01-08

person 99.5
clothing 94.4
outdoor 89.9
human face 89.8
black and white 88.1
man 58.2
monochrome 56.3
woman 52.7
crowd 6.2

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 100%
Calm 74%
Sad 17%
Happy 2.5%
Disgusted 1.7%
Angry 1.5%
Confused 1.4%
Fear 1%
Surprised 0.8%

AWS Rekognition

Age 23-31
Gender Male, 64.6%
Sad 83.6%
Calm 15.2%
Happy 0.9%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%
Confused 0%
Surprised 0%

AWS Rekognition

Age 10-18
Gender Female, 96.4%
Calm 81.4%
Sad 15.4%
Fear 1.5%
Disgusted 0.6%
Angry 0.4%
Confused 0.3%
Happy 0.2%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a group of people sitting on a bench 85.3%
a group of people sitting next to a window 85.2%
a group of people sitting in front of a window 85.1%