Human Generated Data

Title

Untitled (soldiers walking through field led by Vietnamese women and children, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.142.3

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.3
Human 99.3
Person 98.7
Clothing 97.7
Apparel 97.7
Person 97.4
Person 95.3
Face 82.6
Coat 71.2
Person 69.2
Head 66.5
Suit 65.2
Overcoat 65.2
People 62.4
Hat 61.3
Photography 61.1
Photo 61.1
Brick 59.7
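The Amazon list above has the shape of an AWS Rekognition `DetectLabels` response: a label name paired with a confidence score. As a minimal sketch of how such a tag list could be rendered from a Rekognition-style payload (the sample response below is illustrative, not the actual API output for this photograph):

```python
# Sketch: rendering a Rekognition-style DetectLabels response as
# "Name Confidence" lines like the tag list above.

def format_labels(response, min_confidence=55.0):
    """Return 'Name Confidence' lines, highest confidence first."""
    labels = [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {conf:.1f}" for name, conf in labels]

# Illustrative payload in the DetectLabels shape (not the real data).
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Clothing", "Confidence": 97.7},
        {"Name": "Brick", "Confidence": 59.7},
        {"Name": "Tree", "Confidence": 41.0},  # below threshold, dropped
    ]
}

for line in format_labels(sample):
    print(line)
# Person 99.3
# Clothing 97.7
# Brick 59.7
```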

Clarifai
created on 2023-10-22

people 99.5
monochrome 98.8
woman 96.5
art 96.2
portrait 94.9
man 94.6
street 93.1
adult 91.9
girl 89
lid 87.1
child 86.9
wedding 86.2
light 84
one 83.6
music 82.3
veil 78.6
group together 78.4
black and white 77.9
winter 77.5
group 77.4

Imagga
created on 2021-12-14

screw 29.8
propeller 23.5
black 21.9
man 18.8
mechanical device 18.6
fountain 18.1
smoke 16.7
structure 16.3
portrait 16.2
light 16
person 15.2
umbrella 14
sunglass 14
car mirror 13.9
mechanism 13.9
hair 13.5
dark 13.4
people 12.3
face 12.1
mirror 11.6
adult 11.6
pretty 11.2
protection 10.9
industrial 10.9
mask 10.9
factory 10.7
hand 10.6
sexy 10.4
rock 10.4
sunglasses 10.4
negative 10
attractive 9.8
worker 9.8
art 9.8
human 9.7
one 9.7
construction 9.4
industry 9.4
power 9.2
safety 9.2
canopy 9
steel 8.8
protective covering 8.8
women 8.7
water 8.7
device 8.6
flame 8.4
modern 8.4
hot 8.4
reflector 8.3
occupation 8.2
music 8.1
metal 8
night 8
job 8
manufacturing 7.8
male 7.8
film 7.7
motion 7.7
expression 7.7
outdoor 7.6
workplace 7.6
studio 7.6
happy 7.5
danger 7.3
sun 7.2
eye 7.1
work 7.1
sea 7
hat 7
sky 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 97.2
text 95.6
black and white 92.2
human face 84.4
monochrome 79.7
wedding dress 77.9
bride 57.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 36-54
Gender Female, 92%
Calm 41.1%
Fear 35.3%
Sad 12%
Surprised 5.8%
Happy 2.7%
Angry 1.6%
Confused 0.9%
Disgusted 0.7%

AWS Rekognition

Age 40-58
Gender Female, 60.7%
Calm 91.4%
Sad 6.7%
Happy 0.9%
Confused 0.6%
Angry 0.3%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 38-56
Gender Male, 55.4%
Calm 75.6%
Happy 9.2%
Sad 7.6%
Angry 3%
Confused 1.5%
Surprised 1.2%
Fear 1.1%
Disgusted 0.8%

AWS Rekognition

Age 21-33
Gender Male, 87.3%
Calm 78.5%
Happy 15.7%
Sad 3.7%
Angry 0.9%
Fear 0.9%
Disgusted 0.1%
Surprised 0.1%
Confused 0.1%

AWS Rekognition

Age 28-44
Gender Female, 95.2%
Calm 65.5%
Confused 17.1%
Happy 11.2%
Sad 2.6%
Angry 1.3%
Surprised 1.2%
Fear 0.7%
Disgusted 0.5%
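Each face block above follows the shape of a Rekognition `DetectFaces` FaceDetail: an age range, a gender value with confidence, and emotions ordered by confidence. A hedged sketch of rendering one such record; the sample values are copied from the first face above, but the dictionary layout mirrors the Rekognition API and is an assumption about how this data was stored:

```python
# Sketch: rendering a Rekognition-style FaceDetail as the lines shown
# in the face-analysis blocks above. Sample values are illustrative.

def summarize_face(detail):
    """Return lines: age range, gender, then emotions by confidence."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, {detail['Gender']['Confidence']:.0f}%",
    ]
    emotions = sorted(
        detail["Emotions"], key=lambda e: e["Confidence"], reverse=True
    )
    lines += [f"{e['Type'].capitalize()} {e['Confidence']:.1f}%" for e in emotions]
    return lines

# Values taken from the first face block above.
sample = {
    "AgeRange": {"Low": 36, "High": 54},
    "Gender": {"Value": "Female", "Confidence": 92.0},
    "Emotions": [
        {"Type": "CALM", "Confidence": 41.1},
        {"Type": "FEAR", "Confidence": 35.3},
    ],
}

for line in summarize_face(sample):
    print(line)
# Age 36-54
# Gender Female, 92%
# Calm 41.1%
# Fear 35.3%
```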

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

pets animals 99.4%

Captions

Microsoft
created on 2021-12-14

an old photo of a person 60.5%
an old photo of a person 59.6%
old photo of a person 54.5%

Text analysis

Amazon

V2
C