Human Generated Data

Title

Untitled (three men dressed in togas for Kiwanis passion play)

Date

c. 1950

People

Artist: Jack Rodden Studio, American, 1914–2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13432

Machine Generated Data

Tags

Amazon
created on 2022-01-30

Clothing 99.9
Apparel 99.9
Person 98.3
Human 98.3
Person 97.5
Person 96.5
Fashion 94.5
Cloak 92.4

Imagga
created on 2022-01-30

man 42.9
metropolitan 34.1
male 32.6
person 32.6
senior 29.9
people 29.5
portrait 24.6
adult 24.1
business 21.2
businessman 20.3
old 20.2
face 19.9
happy 17.5
mature 16.7
black 15.8
suit 15.8
office 15.5
men 15.4
handsome 15.1
couple 14.8
lifestyle 14.4
hand 14.4
clothing 14.3
shower cap 14.1
expression 13.6
smile 13.5
elderly 13.4
cap 13.3
tie 13.3
manager 13
glasses 12
corporate 12
looking 12
professional 11.7
executive 11.3
human 11.2
hair 11.1
groom 11
work 11
retired 10.6
together 10.5
love 10.2
smiling 10.1
attractive 9.8
success 9.6
retirement 9.6
serious 9.5
happiness 9.4
shirt 9.3
head 9.2
headdress 9.2
indoor 9.1
gray 9
jacket 9
job 8.8
hands 8.7
fun 8.2
worker 8
60s 7.8
eyes 7.7
modern 7.7
covering 7.7
married 7.7
health 7.6
casual 7.6
marriage 7.6
studio 7.6
adults 7.6
career 7.6
alone 7.3
room 7.3
gown 7.3
dress 7.2
women 7.1
medical 7.1
indoors 7

Google
created on 2022-01-30

Microsoft
created on 2022-01-30

concert 95.8
person 95.2
standing 94.8
text 91.3
clothing 87.3
man 81.9
microphone 72.3
people 66.3
white 60.3
music 53.9
human face 50.3

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 95.1%
Calm 99.2%
Happy 0.5%
Surprised 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 13-21
Gender Male, 98.1%
Calm 88.2%
Sad 10.6%
Happy 0.3%
Confused 0.3%
Angry 0.2%
Disgusted 0.2%
Surprised 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.3%

Captions

Microsoft

a group of people standing in front of a window 93%
a group of people standing next to a window 92.9%
a group of people posing for a photo 92.8%

Text analysis

Amazon

KODAK--2AITW

Google

MJI7--YT37A°2 - - XAGON
-
XAGON
MJI7--YT37A°2