Human Generated Data

Title

Untitled (two men standing near Lockhart Implement Company)

Date

1947

People

Artist: Harry Annas, American 1897 - 1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2871

Machine Generated Data

Tags

Amazon
created on 2022-01-16

Person 99.9
Human 99.9
Clothing 99.6
Apparel 99.6
Person 98.8
Shorts 92.5
Female 83.0
Face 80.9
Person 79.6
Outdoors 75
Pants 73.3
Hat 71.7
People 64.6
Woman 64.1
Nature 62.6
Girl 61.8
Photography 60.3
Photo 60.3
Cap 60
Suit 58.3
Coat 58.3
Overcoat 58.3
Tire 55.3

Imagga
created on 2022-01-16

adult 27.2
person 24.9
people 23.4
attractive 23.1
happy 21.3
model 20.2
hair 19
sexy 18.5
pretty 18.2
portrait 18.1
water 18
fashion 17.3
man 16.8
child 16.6
one 15.7
outdoors 15.2
blond 14.9
couple 14.8
summer 14.8
fun 14.2
dark 14.2
happiness 14.1
male 14
kin 13.8
lady 13.8
sibling 13.6
wet 13.4
outdoor 13
smile 12.8
body 12.8
dress 12.6
love 12.6
style 12.6
leisure 12.5
world 12.4
park 12.3
smiling 12.3
erotic 12.3
call 12.2
human 12
beach 11.9
women 11.9
sensual 11.8
lifestyle 11.6
rain 11.3
free 11.3
skin 11
posing 10.7
little 10.6
enjoy 10.3
active 10.3
umbrella 10.2
face 9.9
splashes 9.8
shower 9.7
sun 9.7
casual 9.3
swing 9.3
joy 9.2
hand 9.1
sensuality 9.1
black 9
sand 8.8
passionate 8.8
standing 8.7
seductive 8.6
pleasure 8.5
two 8.5
passion 8.5
studio 8.4
action 8.3
joyful 8.3
clothing 8.2
vacation 8.2
family 8
cute 7.9
jumping 7.7
outside 7.7
youth 7.7
sport 7.5
drops 7.5
playing 7.3
recreation 7.2
kid 7.1
sea 7
autumn 7
sky 7
together 7

Google
created on 2022-01-16

Microsoft
created on 2022-01-16

outdoor 98.8
text 97.9
person 96.6
clothing 87.9
man 87.6
standing 82.2
posing 68.2
black and white 60.7
old 45.8

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 92.9%
Calm 74.7%
Happy 8.2%
Sad 6.3%
Angry 4.5%
Surprised 2.7%
Confused 2.2%
Fear 0.7%
Disgusted 0.6%

AWS Rekognition

Age 43-51
Gender Male, 100%
Happy 56.3%
Calm 37.7%
Surprised 1.8%
Sad 1.4%
Fear 1%
Confused 0.7%
Disgusted 0.6%
Angry 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%

Captions

Microsoft

a man standing in front of a building 89.6%
a group of baseball players standing on top of a building 61.6%
a group of baseball players posing for a photo 61.5%

Text analysis

Amazon

KODAK-
KODAK- STEEL
STEEL

Google

KODVK
KODVK