Human Generated Data

Title

Untitled

Date

1950s

People

Artist: Leon Levinstein, American, 1910-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of John Erdman and Gary Schneider from the Helen Gee Collection, 2016.416

Machine Generated Data

Tags

Amazon
created on 2023-07-06

Adult 99.2
Male 99.2
Man 99.2
Person 99.2
Hugging 97.9
Face 97.1
Head 97.1
Photography 94.3
Portrait 94.3
Hand 90.5
Body Part 90.5
Finger 90.5
Person 89.5

Clarifai
created on 2023-10-13

people 99.7
monochrome 99.4
portrait 98.9
child 98.4
two 97.9
baby 97
adult 96
love 95.8
son 95.5
interaction 95.3
family 95.1
woman 94.8
street 94.5
affection 92.6
man 92.1
black and white 91.9
offspring 90.3
girl 90.3
group 87
administration 84.5

Imagga
created on 2023-07-06

portrait 30.4
people 25.1
face 23.4
statue 22.7
person 22.5
adult 20.7
male 20.2
man 20.2
child 19.7
world 19.3
old 17.4
love 15.8
happy 15.7
parent 15.1
one 14.9
sculpture 14.7
black 14.4
head 14.3
hand 13.7
hair 13.5
dad 13.2
father 12.5
looking 12
crazy 11.8
couple 11.3
human 11.2
attractive 11.2
smile 10.7
lady 10.5
art 10.5
ancient 10.4
men 10.3
women 10.3
happiness 10.2
smiling 10.1
stone 10.1
outdoor 9.9
mother 9.9
fashion 9.8
outdoors 9.7
closeup 9.4
culture 9.4
lifestyle 9.4
model 9.3
cute 9.3
blond 9.2
pretty 9.1
handsome 8.9
juvenile 8.8
grandfather 8.7
antique 8.7
day 8.6
religion 8.1
family 8
close 8
coat 7.9
architecture 7.8
eyes 7.7
summer 7.7
expression 7.7
married 7.7
joy 7.5
city 7.5
vintage 7.4
emotion 7.4
alone 7.3
life 7.2
body 7.2
little 7.1

Google
created on 2023-07-06

Microsoft
created on 2023-07-06

human face 93.3
text 83.4
black and white 80.9
statue 73.2
person 62.8
portrait 57.5
close 21.9
staring 19.5

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 99.6%
Fear 98%
Surprised 6.4%
Sad 2.2%
Angry 0%
Disgusted 0%
Calm 0%
Happy 0%
Confused 0%

Microsoft Cognitive Services

Age 30
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.2%
Male 99.2%
Man 99.2%
Person 99.2%

Categories

Imagga

paintings art 60%
people portraits 39.7%

Captions

Microsoft
created on 2023-07-06

a close up of a dog 46.1%
a close up of a mans face 46%
a close up of an animal 45.9%