Human Generated Data

Title

Untitled (soldiers milling around in front of thatched huts, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.101.3

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.9
Apparel 99.9
Person 99.8
Human 99.8
Person 99.7
Person 99.6
Person 99.5
Person 99.2
Person 98.4
Coat 94.9
Person 93
Person 92.4
Person 91.1
Person 91.1
Nature 79.8
Outdoors 78.2
Shoe 78
Footwear 78
Person 70.6
People 63.5
Overcoat 63.4
Person 60.1
Shoe 59.5
Raincoat 59.2
Hat 55.6
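
The Amazon tags above are label-detection results, with confidence expressed as a percentage. Below is a minimal sketch of how such labels can be requested from AWS Rekognition with boto3; the image path and thresholds are illustrative placeholders, not values taken from this record.

```python
import boto3

# Assumes AWS credentials are already configured in the environment.
# The image path, MaxLabels, and MinConfidence values are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=50.0,
    )

# Each label carries a name and a confidence percentage, matching the
# "Clothing 99.9", "Person 99.8", ... style of the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```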

Clarifai
created on 2023-10-22

people 100
group together 99.5
group 98.3
adult 96.8
man 96.4
child 95.4
many 95
wear 94.1
several 93.9
woman 91.3
street 90.1
umbrella 89.3
recreation 89.1
boy 87.8
outfit 85.1
vehicle 83.3
five 82
three 79.5
veil 79.4
four 78.7
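
The Clarifai concepts above appear to come from a general image-recognition model. The sketch below shows one way such concepts could be requested over Clarifai's v2 REST API; the model id, key, image URL, and exact response shape are assumptions, not details from this record.

```python
import requests

# Hypothetical values: the model id, key, and image URL are assumptions.
MODEL_ID = "general-image-recognition"
API_KEY = "YOUR_CLARIFAI_KEY"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
    timeout=30,
)
response.raise_for_status()

# Assumed response shape: concepts are returned with a 0-1 value; scaling by 100
# gives percent-style scores like "people 100", "group together 99.5", ... above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```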

Imagga
created on 2022-01-23

shelter 56
canvas tent 51.8
hut 48.2
structure 38
tent 28
umbrella 24.9
people 22.3
man 19.5
mountain tent 19.1
outdoors 15.1
person 14.9
male 14.9
adult 14.9
yurt 13.9
old 12.5
leisure 12.4
parasol 12
travel 12
two 11.9
protection 11.8
camping 11.8
summer 11.6
vacation 11.5
car 11.1
dwelling 11
activity 10.7
outdoor 10.7
vehicle 10.5
rain 10.4
camp 9.8
mountain 9.8
adventure 9.5
work 9.4
housing 9.3
color 8.9
equipment 8.8
couple 8.7
forest 8.7
day 8.6
men 8.6
canopy 8.5
vacations 8.5
holding 8.2
human 8.2
sport 8.2
fun 8.2
happy 8.1
recreation 8.1
lifestyle 7.9
love 7.9
together 7.9
holiday 7.9
hiking 7.7
engine 7.7
weather 7.2
smiling 7.2
road 7.2
active 7.2
transportation 7.2
grass 7.1
job 7.1
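
The Imagga tags above follow the same tag-plus-confidence pattern. A rough sketch against Imagga's v2 tagging endpoint is shown below; the credentials and image URL are placeholders, and the response layout is an assumption based on the service's documented format.

```python
import requests

# Placeholder credentials; Imagga uses HTTP Basic auth with a key/secret pair.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
response.raise_for_status()

# Assumed response shape: result.tags is a list of {confidence, tag: {en: ...}},
# which maps onto the "shelter 56", "canvas tent 51.8", ... list above.
for entry in response.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```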

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 99.5
clothing 98.3
text 95.6
man 92.9
black and white 83.7
people 73.5
footwear 70.9
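
The Microsoft tags above resemble the output of Azure Computer Vision's Analyze Image operation. A hedged sketch of a REST call follows; the resource endpoint, key, and image URL are placeholders, and note that the service reports confidence on a 0-1 scale rather than the percent-style numbers shown here.

```python
import requests

# Hypothetical resource endpoint and key; replace with real Azure values.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},
    timeout=30,
)
response.raise_for_status()

# Tags come back with a 0-1 confidence; multiplying by 100 yields scores in the
# same style as "person 99.5", "clothing 98.3", ... above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```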

Color Analysis

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 96.4%
Calm 51.9%
Sad 18.6%
Surprised 8.4%
Disgusted 6.2%
Confused 6%
Fear 3.8%
Angry 3.3%
Happy 1.9%

AWS Rekognition

Age 16-24
Gender Female, 65.9%
Sad 80.9%
Calm 6%
Happy 3%
Fear 2.9%
Angry 2.3%
Surprised 2.1%
Disgusted 1.6%
Confused 1.2%

AWS Rekognition

Age 11-19
Gender Male, 98%
Calm 81.3%
Sad 4.7%
Surprised 4.5%
Angry 4.2%
Happy 2%
Fear 1.1%
Disgusted 1.1%
Confused 1%

AWS Rekognition

Age 30-40
Gender Male, 95.1%
Calm 80.5%
Sad 15%
Happy 1.6%
Confused 1.1%
Disgusted 0.9%
Angry 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 35-43
Gender Female, 99.7%
Calm 68.9%
Happy 20.9%
Confused 2.6%
Surprised 2.3%
Sad 2.3%
Fear 1.4%
Disgusted 0.8%
Angry 0.7%
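
The age, gender, and emotion estimates above are the kind of per-face attributes Rekognition returns from face detection. A minimal boto3 sketch follows; the image path is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder image path; Attributes=["ALL"] requests age range, gender, and
# emotion scores in addition to the default bounding-box data.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions are listed with per-emotion confidences, as in the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```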

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
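
The Google Vision entries above report per-face likelihood buckets (Very unlikely through Very likely) for joy, sorrow, anger, surprise, headwear, and blur. A short sketch with the google-cloud-vision client is shown below; credentials and the image path are placeholders.

```python
from google.cloud import vision

# Assumes application-default credentials are configured; the image path is a placeholder.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation carries likelihood enums (VERY_UNLIKELY ... VERY_LIKELY),
# matching the bucketed values listed above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```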

Feature analysis

Amazon

Person 99.8%
Shoe 78%

Categories