Human Generated Data

Title

Untitled (girl seated on studio prop with large straw hat over shoulders, stuffed dog by feet)

Date

c. 1950, printed later

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12766

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 96.3
Person 96.3
Toy 78.7
Apparel 78.1
Clothing 78.1
Dog 68.4
Mammal 68.4
Pet 68.4
Animal 68.4
Canine 68.4
Art 66.5
Bear 65.2
Wildlife 65.2
Female 57.9
Figurine 55.8
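
The label-and-score pairs above are confidence values (0-100) of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags can be generated with boto3 follows; the file path, MaxLabels, and MinConfidence values are illustrative placeholders, not the settings used for this record.

    # Sketch: object/scene labels from Amazon Rekognition (boto3).
    # Assumes AWS credentials are configured; "photo.jpg" is a placeholder path.
    import boto3

    def detect_labels(path, min_confidence=50):
        client = boto3.client("rekognition")
        with open(path, "rb") as f:
            response = client.detect_labels(
                Image={"Bytes": f.read()},
                MaxLabels=20,
                MinConfidence=min_confidence,
            )
        # Each label carries a name and a 0-100 confidence score,
        # matching entries such as "Human 96.3" and "Toy 78.7" above.
        return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

    for name, score in detect_labels("photo.jpg"):
        print(f"{name} {score:.1f}")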

Clarifai
created on 2019-11-16

people 99.9
adult 99
portrait 97
two 96.7
one 96.6
woman 96.2
wear 96.2
man 93.7
monochrome 93.2
music 91.9
actress 88.8
outfit 85.2
furniture 84.4
actor 82.4
dancer 81.9
art 81.8
facial expression 81.1
group 79.9
retro 79.8
musician 79

Imagga
created on 2019-11-16

person 29.2
portrait 29.1
adult 28
people 25.7
attractive 19.6
man 19.5
pretty 18.9
fashion 18.1
model 17.9
domestic 17.6
male 17.1
bride 16.4
dress 16.3
black 16.2
hair 15.8
face 15.6
sitting 15.5
happy 15
clothing 14.8
sexy 14.5
smile 14.2
professional 14
brunette 13.9
smiling 13.7
couple 13.1
lifestyle 13
women 12.7
elegance 12.6
lady 12.2
wedding 12
style 11.9
human 11.2
men 11.2
love 11
posing 10.7
fun 10.5
one 10.4
health 10.4
body 10.4
makeup 10.1
job 9.7
bridal 9.7
work 9.4
youth 9.4
cute 9.3
skin 9.3
relax 9.3
business 9.1
pose 9.1
cheerful 8.9
child 8.9
working 8.8
looking 8.8
look 8.8
groom 8.6
smasher 8.6
casual 8.5
relaxation 8.4
studio 8.4
hand 8.4
indoor 8.2
celebration 8
happiness 7.8
modern 7.7
married 7.7
room 7.6
head 7.6
bouquet 7.5
dark 7.5
cosmetics 7.5
holding 7.4
sensuality 7.3
make 7.3
office 7.2
performer 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

wall 98.4
person 96.7
text 87.5
cat 69.3
animal 64.8
human face 62
smile 61.9
clothing 60.1
posing 53.5

Color Analysis

Face analysis

AWS Rekognition

Age 5-15
Gender Female, 73%
Surprised 0.4%
Disgusted 0.1%
Sad 0.5%
Angry 0.2%
Calm 0.6%
Happy 97.9%
Fear 0.2%
Confused 0.1%
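
The age range, gender estimate, and per-emotion percentages above match the FaceDetails structure returned by Amazon Rekognition's DetectFaces operation when all facial attributes are requested. A hedged sketch, using a placeholder file path rather than the exact call made for this record:

    # Sketch: facial attributes (age range, gender, emotions) from Amazon Rekognition.
    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder path
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        for emotion in face["Emotions"]:  # e.g. HAPPY 97.9, CALM 0.6
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")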

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely
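
Google Vision reports face attributes as likelihood buckets (Very unlikely through Very likely) rather than percentages. A minimal sketch with the google-cloud-vision client library, assuming credentials are configured and using a placeholder path:

    # Sketch: face detection likelihoods from the Google Cloud Vision API.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihoods are enum buckets, e.g. VERY_UNLIKELY, LIKELY.
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)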

Feature analysis

Amazon

Person 96.3%
Dog 68.4%
Bear 65.2%