Human Generated Data

Title

Vietnam

Date

c. 1970

People

Artist: Philip Jones Griffiths, British (Welsh), 1936 - 2008

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1985.12

Machine Generated Data

Tags (with confidence scores)

Amazon
created on 2021-12-14

Person 98.5
Human 98.5
Clothing 82.6
Apparel 82.6
Leisure Activities 62
People 60.8
Building 57.5
Furniture 56.4
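
These labels have the shape of output from Amazon Rekognition's DetectLabels operation (label names with percent confidences). As a hedged illustration only, assuming a local copy of the image and default AWS credentials (the actual pipeline, file names, and thresholds are not documented in this record), a minimal boto3 sketch that would produce comparable label/confidence pairs:

    # Sketch: label/confidence pairs from Amazon Rekognition DetectLabels.
    # The file name and region are placeholders, not values from this record.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("griffiths_vietnam.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # assumed cutoff; the lowest label above is 56.4
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')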

Imagga
created on 2021-12-14

sombrero 100
hat 100
headdress 74.7
clothing 53.3
covering 26.4
consumer goods 25.1
black 24.4
sexy 20.1
portrait 19.4
fashion 18.8
person 18.1
adult 16.8
model 15.5
attractive 15.4
people 15.1
style 14.8
face 14.2
pretty 14
dress 13.5
studio 12.9
hair 12.7
art 12.5
man 12.1
grunge 11.9
carnival 11.7
retro 11.5
male 11.3
make 10.9
dark 10.9
costume 10.7
elegance 10.1
makeup 10.1
sensuality 10
human 9.7
brunette 9.6
party 9.5
elegant 9.4
decorative 9.2
vintage 9.1
decoration 9.1
music 9
one 9
masquerade 8.9
conceptual 8.8
culture 8.5
design 8.5
slick 8.2
pattern 8.2
sensual 8.2
lady 8.1
body 8
love 7.9
artistic 7.8
expression 7.7
hand 7.6
graphic 7.3
digital 7.3
mask 7.2
cute 7.2
smile 7.1
romantic 7.1
women 7.1
posing 7.1
night 7.1
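
The Imagga list is a comparable tag/confidence dump. Assuming Imagga's v2 /tags REST endpoint with HTTP basic authentication, a placeholder key pair, and a placeholder image URL (none of which are confirmed by this record), a request of this kind might look like:

    # Sketch: tag/confidence pairs from the Imagga v2 tagging endpoint.
    # API key, secret, and image URL are placeholders.
    import requests

    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"
    image_url = "https://example.org/griffiths_vietnam.jpg"  # hypothetical URL

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')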

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 100
book 99.9
drawing 95.9
person 94
black and white 92.9
clothing 91.8
sketch 84.1
cartoon 76.2
human face 72.1
poster 64.6
monochrome 54.6
painting 51.8
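
The Microsoft tags here, and the captions listed further down, correspond to the Tags and Description features of the Azure Computer Vision Analyze Image API. A minimal sketch, assuming a v3.2 endpoint, a placeholder subscription key, and a local copy of the image:

    # Sketch: tags and captions from Azure Computer Vision v3.2 Analyze Image.
    # Endpoint, key, and file name are placeholders, not values from this record.
    import requests

    ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
    KEY = "your_subscription_key"

    with open("griffiths_vietnam.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags,Description"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    resp.raise_for_status()
    analysis = resp.json()

    for tag in analysis["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')  # e.g. "text 100.0"
    for caption in analysis["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}')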

Color Analysis

Face analysis

AWS Rekognition

Age 24-38
Gender Female, 71.3%
Calm 78.3%
Sad 19.9%
Confused 0.6%
Fear 0.5%
Surprised 0.3%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 2-8
Gender Female, 56.4%
Sad 99%
Calm 0.9%
Fear 0.1%
Angry 0%
Confused 0%
Surprised 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 22-34
Gender Male, 76.4%
Sad 94%
Surprised 2.8%
Confused 1.2%
Fear 0.9%
Calm 0.8%
Angry 0.2%
Disgusted 0.2%
Happy 0%

AWS Rekognition

Age 36-52
Gender Male, 83.8%
Calm 72.5%
Sad 22.9%
Surprised 2.3%
Confused 0.7%
Happy 0.5%
Angry 0.5%
Fear 0.3%
Disgusted 0.1%
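
The four face records above follow the shape of Amazon Rekognition's DetectFaces output: an age range, a gender estimate with confidence, and a ranked list of emotion scores for each detected face. A hedged sketch of how such per-face attributes could be retrieved, with the same placeholder file name and region as above:

    # Sketch: per-face age range, gender, and emotion scores from
    # Amazon Rekognition DetectFaces. File name and region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("griffiths_vietnam.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # needed to get age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')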

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
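
The Google Vision rows report face-detection likelihood buckets (Very unlikely through Very likely) rather than numeric scores. A minimal sketch, assuming the google-cloud-vision Python client (version 2 or later) and a placeholder file name:

    # Sketch: face-detection likelihoods (surprise, anger, sorrow, joy,
    # headwear, blur) via the Google Cloud Vision client. Placeholder file name.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("griffiths_vietnam.jpg", "rb") as f:  # hypothetical file name
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)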

Feature analysis

Amazon

Person 98.5%

Captions

Microsoft

a person holding a book 50.7%
a person sitting on top of a book 44.2%
a person sitting on a book 41.7%

Text analysis

Amazon

YKIEN
few
few families
GIAI
families
Childhin
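
The fragments above are raw optical character recognition of text visible in the print, left here exactly as detected. They are the kind of output Amazon Rekognition's DetectText operation returns; a hedged sketch, with the same placeholder file name and region as above:

    # Sketch: detected text lines and confidences from Amazon Rekognition
    # DetectText. File name and region are placeholders.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("griffiths_vietnam.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip the word-level duplicates
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')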