Human Generated Data

Title

Untitled (two men displaying large fish)

Date

1952-1957

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6351

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.7
Human 99.7
Person 99.5
Clothing 99.1
Apparel 99.1
Pants 98.8
Person 91.9
Sleeve 90.8
Jeans 77
Denim 77
Face 75.8
Man 71.3
Long Sleeve 70.4
Portrait 62.4
Photography 62.4
Photo 62.4
Footwear 59.9
Shoe 59.4
Coat 57.8
Skin 56.6
Shorts 55.4
Jacket 55.3
Hat 55.1
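
The label/confidence pairs above are typical of Amazon Rekognition label detection. A minimal sketch of how such tags could be generated, assuming boto3 is configured with AWS credentials and the photograph is saved locally as photograph.jpg (a placeholder filename):

# Minimal sketch: Rekognition label detection on a local image file.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as image_file:
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=55,  # the lowest score in the list above is 55.1
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")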

Imagga
created on 2022-01-22

man 38.9
male 37.6
person 34.5
black 26.6
people 25.1
portrait 23.9
clothing 22.8
fashion 21.1
adult 20.9
hat 19.8
face 19.2
model 18.6
uniform 17.6
jacket 17.1
handsome 16
pose 15.4
casual 15.2
suit 14.5
studio 14.4
style 14.1
guy 13.9
art 12.9
expression 12.8
posing 12.4
job 12.4
men 12
occupation 11.9
dark 11.7
worker 11.5
human 11.2
attractive 11.2
body 11.2
helmet 11.1
happy 10.6
military uniform 10.5
sexy 10.4
hand 9.9
modern 9.8
old 9.7
lady 9.7
business 9.7
businessman 9.7
one 9.7
clothes 9.4
covering 9.1
dress 9
private 9
cool 8.9
professional 8.9
success 8.8
dressed 8.8
standing 8.7
lifestyle 8.7
work 8.6
smile 8.5
elegance 8.4
costume 8.3
boy 8.1
looking 8
look 7.9
warrior 7.8
serious 7.6
confident 7.3
stylish 7.2
religion 7.2
hair 7.1
happiness 7
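
Tag/confidence pairs like the Imagga list above can be produced with Imagga's image tagging endpoint. A minimal sketch, assuming the requests library and placeholder API credentials:

# Minimal sketch: Imagga /v2/tags with HTTP Basic auth (placeholder key/secret).
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # placeholder

with open("photograph.jpg", "rb") as image_file:
    response = requests.post(
        "https://api.imagga.com/v2/tags",
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        files={"image": image_file},
    )

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")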

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.1
text 99.1
clothing 98.1
man 92.4
standing 92
posing 82.4
black 70.4
old 46.2
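
Scores like the Microsoft list above resemble output of the Azure Computer Vision Analyze Image API with the Tags feature. A minimal sketch, assuming a placeholder Azure endpoint and subscription key:

# Minimal sketch: Azure Computer Vision v3.2 Analyze Image, Tags feature only.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                     # placeholder

with open("photograph.jpg", "rb") as image_file:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_file.read(),
    )

for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")  # scale 0-1 to percent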

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 93.1%
Calm 99%
Sad 0.4%
Happy 0.3%
Angry 0.2%
Surprised 0.1%
Disgusted 0%
Confused 0%
Fear 0%

AWS Rekognition

Age 54-64
Gender Male, 100%
Calm 62%
Happy 14.5%
Surprised 10.9%
Confused 4.9%
Disgusted 3.7%
Sad 2.4%
Angry 1.1%
Fear 0.6%
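
Age ranges, gender, and emotion scores like the two results above come from face detection rather than label detection. A minimal sketch using Amazon Rekognition's DetectFaces, under the same boto3 assumptions as the label-detection sketch:

# Minimal sketch: Rekognition DetectFaces with all facial attributes requested.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")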

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
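
The likelihood ratings above (surprise, anger, sorrow, joy, headwear, blur) correspond to fields of Google Cloud Vision face annotations. A minimal sketch, assuming the google-cloud-vision client library and default application credentials:

# Minimal sketch: Google Cloud Vision face detection and likelihood fields.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)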

Feature analysis

Amazon

Person 99.7%
Hat 55.1%

Captions

Microsoft

a group of people posing for a photo 96.5%
a group of men posing for a photo 96.1%
a group of people posing for the camera 96%
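
Ranked captions like the three above resemble output of the Azure Computer Vision Describe Image endpoint, which returns several candidate captions with confidences. A minimal sketch, under the same placeholder endpoint and key as the tag sketch:

# Minimal sketch: Azure Computer Vision v3.2 Describe Image, three candidates.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                     # placeholder

with open("photograph.jpg", "rb") as image_file:
    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_file.read(),
    )

for caption in response.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}")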

Text analysis

Amazon

TS3
KODAS
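
Strings such as these are typical of Amazon Rekognition text detection. A minimal sketch, under the same boto3 assumptions as above:

# Minimal sketch: Rekognition DetectText, printing detected lines only.
import boto3

rekognition = boto3.client("rekognition")

with open("photograph.jpg", "rb") as image_file:
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the individual WORD entries
        print(detection["DetectedText"])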

Google

KODVK-2VEEIA T53
KODVK-2VEEIA
T53
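
The Google strings above resemble output of Google Cloud Vision text detection, where the first annotation is the full detected text and later annotations are individual fragments. A minimal sketch, under the same google-cloud-vision assumptions as the face sketch:

# Minimal sketch: Google Cloud Vision text detection (OCR) on a local image.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as image_file:
    image = vision.Image(content=image_file.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)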