Human Generated Data

Title

Untitled (woman with two young daughters standing in front of iron railing on porch)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American (1900–1988)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12785

Machine Generated Data

Tags (label and confidence score, in percent)

Amazon
created on 2019-11-16

Clothing 99.3
Apparel 99.3
Human 99.2
Person 99.2
Person 98.6
Person 98.5
Coat 90
Overcoat 90
Outdoors 88.1
Home Decor 87.2
Nature 79.7
Footwear 78.2
Shoe 78.2
Snow 75.2
Female 67.7
Dress 66.8
Plant 61.7
Paper 61.5
Pants 58.8
Suit 58.1
Ice 58
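
These labels are the kind returned by Amazon Rekognition's DetectLabels operation. Below is a minimal sketch of that call using boto3; the file name, region, and thresholds are placeholder assumptions, not the values of the actual pipeline behind this page.

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# File name, region, and thresholds are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=50.0,
    )

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```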

Clarifai
created on 2019-11-16

people 99.9
adult 98.3
woman 97.9
wear 97.7
two 96.2
group 95.8
man 93.2
group together 92.5
dress 92.4
child 90.8
actress 89.7
street 89
outfit 88.8
wedding 87.9
fashion 86.7
one 85.3
many 85
monochrome 83.9
three 81.1
girl 80.9
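
Clarifai's general model produces a concept list like the one above. A sketch against the Clarifai v2 REST API as it existed around 2019 follows; the API key, model ID, and image URL are placeholder assumptions.

```python
# Minimal sketch: concept tagging via the Clarifai v2 REST API (circa 2019).
# API key, model ID, and image URL are placeholder assumptions.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed ID of the general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 value; scale to percent to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```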

Imagga
created on 2019-11-16

sax 25.6
person 24.7
man 21.7
people 21.2
black 18
musical instrument 15.8
portrait 15.5
male 14.2
wind instrument 14
planner 13.8
attractive 13.3
adult 12.6
lady 12.2
grunge 11.9
drawing 11.7
couple 11.3
fashion 11.3
happy 10.6
outdoors 10.4
style 10.4
city 10
dress 9.9
sexy 9.6
lifestyle 9.4
smile 9.3
business 9.1
silhouette 9.1
pretty 9.1
sketch 9.1
old 9.1
stringed instrument 9
businessman 8.8
standing 8.7
play 8.6
bass 8.6
bowed stringed instrument 8.6
model 8.5
cello 8.5
studio 8.4
active 8.1
office 8
chair 8
looking 8
job 8
musician 7.9
cute 7.9
singer 7.8
men 7.7
winter 7.7
outdoor 7.6
power 7.6
human 7.5
one 7.5
holding 7.4
park 7.4
street 7.4
professional 7.3
smiling 7.2
work 7.2
face 7.1
day 7.1
sport 7
sky 7
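
Imagga exposes tagging through a REST endpoint authenticated with HTTP basic auth. A minimal sketch follows; the key/secret and image URL are placeholders, and the response layout follows Imagga's documented /v2/tags format.

```python
# Minimal sketch: tagging via Imagga's /v2/tags endpoint.
# API key/secret and image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP basic auth
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```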

Google
created on 2019-11-16

Photograph 97.4
Black-and-white 87.9
Snapshot 87.3
Standing 85.8
Photography 77
Picture frame 75.7
Tree 74.7
Stock photography 72.2
Monochrome 64.3
Plant 58.5
Window 53.8
Style 53.5
Art 50.2
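
Google's labels correspond to the Cloud Vision API's label detection feature. A minimal sketch with the official client library follows; it assumes GOOGLE_APPLICATION_CREDENTIALS is configured, and the file name is a placeholder.

```python
# Minimal sketch: label detection with the Google Cloud Vision client library.
# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; the file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores come back in [0, 1]; scale to percent to match the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```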

Microsoft
created on 2019-11-16

clothing 95.2
person 92.7
text 90.5
standing 87.6
dress 82.8
woman 73.6
footwear 73.1
posing 65.2
white 64.6
black and white 62.6
store 35.2
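
Microsoft's tags match the output of Azure Computer Vision (the successor branding of Microsoft Cognitive Services image analysis). A minimal sketch follows; the endpoint, key, and image URL are placeholders.

```python
# Minimal sketch: image tagging with Azure Computer Vision.
# Endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder
)

result = client.tag_image("https://example.com/photo.jpg")

# Confidences are 0-1; scale to percent to match the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```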

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 52.7%
Happy 45.1%
Disgusted 45%
Confused 45.1%
Calm 54.7%
Fear 45%
Sad 45.1%
Angry 45.1%
Surprised 45%

AWS Rekognition

Age 4-14
Gender Female, 54.8%
Sad 45%
Calm 45.1%
Angry 45%
Fear 45%
Happy 54.8%
Surprised 45%
Disgusted 45%
Confused 45.1%

AWS Rekognition

Age 1-7
Gender Female, 54.5%
Fear 45.2%
Surprised 45.1%
Angry 45.3%
Sad 47.6%
Disgusted 45%
Happy 50.8%
Calm 45.9%
Confused 45.1%
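
The three face records above are the shape of output produced by Amazon Rekognition's DetectFaces operation with full attributes requested. A minimal sketch follows; the file name and region are placeholders.

```python
# Minimal sketch: face analysis with Amazon Rekognition's DetectFaces,
# which reports the age range, gender, and emotion scores shown above.
# The file name and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```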

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 4
Gender Female
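
Point estimates of age and gender like the two records above are what the Azure Face API returns when face attributes are requested. A minimal sketch follows; the endpoint, key, and image URL are placeholders, and the attribute list is an assumption.

```python
# Minimal sketch: age/gender estimates via the Azure Face API
# (azure-cognitiveservices-vision-face). Endpoint, key, and URL are placeholders.
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

face_client = FaceClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),                # placeholder
)

faces = face_client.face.detect_with_url(
    url="https://example.com/photo.jpg",
    return_face_attributes=["age", "gender"],
)

for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}")
    print(f"Gender {attrs.gender}")
```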

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
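
The likelihood buckets above ("Very unlikely" through "Very likely") are Google Cloud Vision's per-face likelihood enum. A minimal sketch of the corresponding face detection call follows; the file name is a placeholder.

```python
# Minimal sketch: face detection with Google Cloud Vision, which returns
# per-face likelihood buckets (VERY_UNLIKELY ... VERY_LIKELY) as shown above.
# The file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```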

Feature analysis

Amazon

Person 99.2%
Shoe 78.2%

Categories

Text analysis

Google

HHIPHHA
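
This string is OCR output from Google Cloud Vision's text detection. A minimal sketch follows; the file name is a placeholder. Note that the API returns the full detected string first and then repeats it word by word, which is why pages like this can show the same token more than once.

```python
# Minimal sketch: OCR with Google Cloud Vision text detection.
# The file name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the full detected string; later entries repeat it
# word by word.
if response.text_annotations:
    print(response.text_annotations[0].description)
```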