Human Generated Data

Title

Untitled (man and girl)

Date

c. 1975, from c. 1940 original

People

Artist: Legler Studio, American, active 1930s-1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19708

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Person 98.5
Human 98.5
Suit 98.2
Overcoat 98.2
Coat 98.2
Person 95.2
Face 94.4
Tuxedo 89.2
Sleeve 85.5
Shirt 80.5
Portrait 68.8
Photography 68.8
Photo 68.8
Undershirt 65.3
People 64.5
Female 62
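
The Amazon labels above are the style of output returned by AWS Rekognition's DetectLabels operation. As an illustration only (the museum's actual pipeline, image file name, and service version are not documented in this record), a minimal boto3 sketch producing this kind of tag list might look like:

import boto3

# Assumes AWS credentials are configured and the photograph is saved
# locally as "photo.jpg"; both are hypothetical placeholders.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=60)
for label in response["Labels"]:
    # Prints e.g. "Clothing 99.9", matching the format of the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")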

Clarifai
created on 2023-10-22

people 99.6
woman 96.8
two 95.9
adult 95.7
portrait 94.5
man 93.3
family 92.2
wear 90.6
child 90.6
indoors 87.2
art 86.6
window 85.6
group 83.7
doorway 83.7
room 82.8
one 81.7
medical practitioner 79.7
science 79.5
wedding 78
three 77.1

Imagga
created on 2022-03-05

lab coat 25.9
people 25.7
coat 24.7
person 23.7
adult 22.1
portrait 20.1
man 19.5
clothing 19.2
garment 17.6
attractive 16.8
male 15.7
looking 14.4
one 14.2
standing 13.9
lady 13.8
face 13.5
nurse 13.2
happy 13.2
smile 12.8
human 12.7
old 12.5
lifestyle 12.3
fashion 12.1
home 12
hair 11.9
happiness 11.7
black 11.5
room 11.3
sexy 11.2
statue 11
model 10.9
dress 10.8
posing 10.7
medical 10.6
indoors 10.5
body 10.4
alone 10
pretty 9.8
business 9.7
brunette 9.6
women 9.5
love 9.5
men 9.4
bride 8.8
hospital 8.8
window 8.7
bright 8.6
sculpture 8.4
suit 8.3
vintage 8.3
light 8
family 8
businessman 7.9
professional 7.8
modern 7.7
elegant 7.7
health 7.6
casual 7.6
businesspeople 7.6
doctor 7.5
clothes 7.5
single 7.4
wedding 7.4
confident 7.3
pose 7.2
smiling 7.2
office 7.2
fresh 7.2
film 7.1
curtain 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

wall 99
man 98.5
text 97.9
drawing 96.5
sketch 96.4
posing 95.9
clothing 94.5
person 93.8
human face 93
white 85.3
smile 84.9
standing 77.6
old 77.1
black and white 73.3
painting 61.5
picture frame 8.6

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 37-45
Gender Male, 100%
Surprised 98.3%
Calm 1.1%
Happy 0.3%
Confused 0.2%
Disgusted 0.1%
Fear 0.1%
Sad 0.1%
Angry 0%

AWS Rekognition

Age 38-46
Gender Male, 99%
Calm 97.8%
Surprised 0.7%
Happy 0.5%
Sad 0.5%
Confused 0.3%
Disgusted 0.2%
Angry 0.1%
Fear 0.1%
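
The two AWS Rekognition face records above (estimated age range, gender, and ranked emotions for each detected face) correspond to the DetectFaces operation with full attributes enabled. A minimal sketch, assuming boto3 and a local file named "photo.jpg" (both placeholders, not part of the record):

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions are reported per face; sort by confidence to match the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")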

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
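
The Google Vision face entries report categorical likelihoods rather than percentages. A minimal sketch using the google-cloud-vision Python client (the client version and file name are assumptions, not documented here):

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)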

Feature analysis

Amazon

Person
Person 98.5%
Person 95.2%

Categories

Text analysis

Amazon

142
EASTMAN-NITRATE-KODAK
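
The Amazon text results ("142" and the Eastman nitrate Kodak film-edge marking) are the kind of output produced by Rekognition's DetectText operation. A minimal sketch, with the file name again a placeholder:

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    # LINE entries give whole lines; WORD entries give individual tokens.
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])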

Google

EASTMAN-NITRATE-KOoDAK 142
EASTMAN-NITRATE-KOoDAK
142
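
The Google results are likewise OCR output (the "KOoDAK" spelling is the raw machine reading of the edge print, not a transcription error). A minimal sketch of an equivalent call with the google-cloud-vision client, with the file name assumed:

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
# The first annotation is the full detected text block; the rest are
# individual segments, mirroring the multi-line listing above.
for annotation in response.text_annotations:
    print(annotation.description)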