Human Generated Data

Title

Untitled (three men posing by animal skins)

Date

1951

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6293

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.8
Human 99.8
Person 99.4
Person 97
Apparel 92.8
Clothing 92.8
Mammal 82.2
Dog 82.2
Animal 82.2
Canine 82.2
Pet 82.2
Outdoors 81.4
Military Uniform 80.8
Military 80.8
Nature 78.2
Soldier 78.1
Face 76.2
Photography 62.1
Portrait 62.1
Photo 62.1
Coat 61.7
Army 60
Armored 60
Overcoat 59.6
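Label lists like the one above are typically produced by an image-labeling API such as AWS Rekognition's DetectLabels operation. A minimal sketch of how such a response could be rendered into the "Tag Confidence" lines shown here; the response dict below is a hand-written stand-in for illustration, not the stored output for this photograph:

```python
# Sketch: turning a Rekognition-style DetectLabels response into
# "Label Confidence" lines. The mock response is an assumption for
# illustration, not real API output for this image.

def format_labels(response, min_confidence=55.0):
    """Return 'Name Confidence' lines, sorted by descending confidence."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response.get("Labels", [])
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {round(conf, 1)}" for name, conf in labels]

# With boto3, the real call would look roughly like this (requires AWS
# credentials and an actual image file; shown only as a sketch):
# import boto3
# client = boto3.client("rekognition")
# response = client.detect_labels(
#     Image={"Bytes": open("photo.jpg", "rb").read()}, MinConfidence=55
# )

mock_response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.8},
        {"Name": "Dog", "Confidence": 82.2},
        {"Name": "Overcoat", "Confidence": 59.6},
        {"Name": "Hat", "Confidence": 40.0},  # below threshold, dropped
    ]
}
print(format_labels(mock_response))  # → ['Person 99.8', 'Dog 82.2', 'Overcoat 59.6']
```

The confidence threshold explains why the lists above stop in the high-50s: detections below the cutoff are simply not recorded.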

Imagga
created on 2022-01-22

laptop 71
windshield 58.2
computer 51.8
screen 48.6
office 39.4
business 36.4
protective covering 35.4
working 35.3
work 34.5
man 33.6
person 30.7
sitting 29.2
people 28.4
notebook 28.4
adult 27.2
businessman 25.6
desk 25.5
male 24.8
technology 24.5
worker 24
covering 23.7
corporate 23.2
job 23
happy 22.5
television 22.5
businesswoman 19.1
smiling 18.8
smile 18.5
executive 17.6
attractive 17.5
handsome 16.9
keyboard 16.9
portrait 16.8
looking 16.8
professional 16.1
casual 16.1
table 15.6
suit 15.3
wireless 15.2
communication 15.1
modern 14.7
education 14.7
portable computer 14.6
telecommunication system 14.4
face 14.2
success 13.7
student 13.6
home 13.6
pretty 13.3
indoors 13.2
lifestyle 13
confident 12.7
manager 12.1
one 11.9
typing 11.7
cheerful 11.4
personal computer 11
happiness 11
lady 10.5
together 10.5
reading 10.5
businesspeople 10.4
glasses 10.2
phone 10.1
women 9.5
career 9.5
men 9.4
mature 9.3
successful 9.1
indoor 9.1
interior 8.8
halibut 8.8
busy 8.7
stress 8.6
cute 8.6
workplace 8.6
tie 8.5
sit 8.5
shirt 8.5
contemporary 8.5
hand 8.4
using 7.7
sofa 7.6
employee 7.6
boss 7.6
two 7.6
squeegee 7.6
mobile 7.5
meeting 7.5
learning 7.5
digital computer 7.4
20s 7.3
alone 7.3
monitor 7.3
black 7.2
hair 7.1
family 7.1
day 7.1
paper 7.1
flatfish 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

person 99.6
text 97.8
black and white 74.3
clothing 59.7

Face analysis

Amazon

Google

AWS Rekognition

Age 48-56
Gender Male, 100%
Calm 34%
Surprised 32.8%
Sad 11.6%
Confused 8.2%
Disgusted 4.6%
Angry 3.5%
Fear 2.8%
Happy 2.6%

AWS Rekognition

Age 37-45
Gender Male, 99.8%
Calm 98.6%
Sad 1%
Confused 0.2%
Surprised 0.1%
Angry 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 98.2%
Happy 98.4%
Calm 0.7%
Surprised 0.2%
Angry 0.2%
Disgusted 0.2%
Sad 0.1%
Confused 0.1%
Fear 0.1%
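The per-face blocks above follow the shape of a Rekognition DetectFaces FaceDetail: an age range, a gender estimate with confidence, and a list of emotion scores. A minimal sketch of rendering one such detail into the lines shown; the detail dict is a hand-written assumption, not the stored output for this photograph:

```python
# Sketch: rendering a Rekognition-style FaceDetail into the
# age/gender/emotion lines above. mock_detail is illustrative only.

def format_face(detail):
    """Return the age, gender, and emotion lines for one detected face."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, "
        f"{round(detail['Gender']['Confidence'], 1)}%",
    ]
    # Emotions arrive unordered; sort by descending confidence as above.
    emotions = sorted(
        detail["Emotions"], key=lambda e: e["Confidence"], reverse=True
    )
    lines += [
        f"{e['Type'].capitalize()} {round(e['Confidence'], 1)}%"
        for e in emotions
    ]
    return lines

mock_detail = {
    "AgeRange": {"Low": 37, "High": 45},
    "Gender": {"Value": "Male", "Confidence": 99.8},
    "Emotions": [
        {"Type": "SAD", "Confidence": 1.0},
        {"Type": "CALM", "Confidence": 98.6},
    ],
}
print(format_face(mock_detail))
```

Note that the age values are an estimated range, not a measurement, and the emotion percentages across one face sum to roughly 100.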

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Dog 82.2%

Captions

Microsoft

a group of people around each other 76.5%
a group of people playing instruments and performing on a stage 54%
a group of people riding skis on a snowy surface 29.2%

Text analysis

Amazon

S
TS3
421S
421S I
I
DO
- DO
-
Ministry

Google

-4213 YT37A2-XAGOX
-4213
YT37A2-XAGOX
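Fragments like these are typical of OCR on a historic photograph, where stamps and markings yield partial strings. A minimal sketch of extracting line-level detections from a Rekognition-style DetectText response; the response dict is an illustrative assumption, not the stored output for this image:

```python
# Sketch: pulling detected-text strings out of a Rekognition-style
# DetectText response. Responses interleave LINE and WORD detections;
# WORD entries repeat the content of their parent LINE, so we keep
# only LINEs here. mock_response is illustrative only.

def detected_lines(response):
    """Return the text of LINE-type detections, in returned order."""
    return [
        d["DetectedText"]
        for d in response.get("TextDetections", [])
        if d["Type"] == "LINE"
    ]

mock_response = {
    "TextDetections": [
        {"DetectedText": "421S", "Type": "LINE"},
        {"DetectedText": "421S", "Type": "WORD"},  # duplicate of the LINE
        {"DetectedText": "Ministry", "Type": "LINE"},
    ]
}
print(detected_lines(mock_response))  # → ['421S', 'Ministry']
```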