Human Generated Data

Title

Cambridge, Mass.

Date

1981

People

Artist: Sage Sohier, American, born 1954

Publisher: Photographic Resource Center

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the Photographic Resource Center, 2.2002.1047

Copyright

© Sage Sohier

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Apparel 100
Shorts 100
Clothing 100
Footwear 99.9
Shoe 99.9
Human 99.8
Person 99.8
Person 99.8
Person 99.6
Person 99.6
Shoe 99
Person 98.9
Shoe 98.6
Shoe 97.7
Shoe 81.3
People 61.4

Imagga
created on 2022-01-09

person 36.8
adult 32.6
people 29.6
man 26.2
fashion 24.1
happy 23.8
attractive 23.8
smile 22.1
sexy 20.9
pretty 19.6
smiling 19.5
lady 18.7
women 18.2
clothing 17.9
couple 17.4
happiness 17.2
lifestyle 16.6
model 15.6
active 15
male 15
style 14.8
black 14.8
portrait 13.6
two 13.6
outdoors 13.1
standing 13
casual 12.7
bike 12.7
hair 12.7
dress 12.6
outfit 12
20s 11.9
bag 11.8
equipment 11.5
business 11.5
cute 11.5
garment 11.4
indoors 11.4
cheerful 11.4
outside 11.1
expression 11.1
bicycle 11
together 10.5
body 10.4
clothes 10.3
fit 10.1
face 9.9
holding 9.9
jean 9.7
hand 9.1
confident 9.1
stylish 9
health 9
crutch 9
healthy 8.8
urban 8.7
ride 8.7
full length 8.7
high 8.7
dark 8.4
slim 8.3
human 8.2
professional 8.2
sensuality 8.2
guy 8.1
posing 8
cycling 7.9
riding 7.8
corporate 7.7
sitting 7.7
sport 7.6
skirt 7.6
enjoying 7.6
energy 7.6
city 7.5
one 7.5
student 7.4
joyful 7.3
teenager 7.3
sensual 7.3
businesswoman 7.3
office 7.2
blond 7.2
trouser 7.1
to 7.1
interior 7.1
summer 7.1
look 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.8
posing 98.3
clothing 96
text 93.9
outdoor 91.6
woman 87.6
standing 86.3
dress 85.3
smile 84.2
dance 81.5
group 77.3
people 69.4
man 64.4

Face analysis

AWS Rekognition

Age 9-17
Gender Male, 72.3%
Calm 97.9%
Confused 1.6%
Sad 0.2%
Angry 0.1%
Surprised 0.1%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 21-29
Gender Male, 99.9%
Calm 60%
Sad 15.6%
Fear 9%
Confused 6.8%
Disgusted 3.2%
Surprised 2.9%
Angry 1.5%
Happy 1%

AWS Rekognition

Age 28-38
Gender Female, 88.8%
Calm 96.9%
Sad 1.4%
Happy 0.7%
Angry 0.4%
Disgusted 0.3%
Fear 0.2%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 4-12
Gender Female, 99.3%
Calm 76.2%
Surprised 9.5%
Disgusted 3.3%
Happy 3%
Angry 2.4%
Fear 2.1%
Sad 2%
Confused 1.6%

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 99.8%
Sad 0.1%
Surprised 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%
Fear 0%

Microsoft Cognitive Services

Age 15
Gender Male

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Shoe 99.9%
Person 99.8%

Captions

Microsoft

a group of people posing for a photo 98.5%
a group of people posing for the camera 98.4%
a group of people posing for a picture 98.3%