Human Generated Data

Title

Untitled (twin girls in matching outfits on porch, standing and sitting against column)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900-1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12970

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Apparel 99.7
Clothing 99.7
Shorts 99.7
Human 99.2
Person 99.2
Person 97.4
Building 92.3
Architecture 92.3
Pillar 79.9
Column 79.9
Finger 69.2
Plant 68.8
Tree 68.8
Grass 61.2
Pants 58.5
Flooring 58.2
Female 56.6
Girl 56.2
Banister 55.7
Handrail 55.7
Sleeve 55.1
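
The label/score pairs above follow the output shape of Amazon Rekognition's DetectLabels call, where each confidence is a 0-100 percentage. A minimal sketch of how such tags are produced, assuming configured AWS credentials and a local copy of the image (the file name "photo.jpg" is a placeholder):

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,        # the list above holds 21 labels
        MinConfidence=55.0,  # the lowest score shown above is 55.1
    )

for label in response["Labels"]:
    # Confidence is a 0-100 percentage, matching the scores above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

DetectLabels also returns bounding-box instances for objects such as Person, which is the kind of result reported under "Feature analysis" below.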

Clarifai
created on 2019-11-16

people 99.8
monochrome 98.6
woman 97.7
adult 97.6
two 96.1
portrait 95.6
girl 94.8
child 93.6
man 93.2
one 93
street 92.3
wedding 90.5
black and white 88.1
couple 86.9
actress 86.5
wear 85.2
love 80.4
boy 79.1
position 78.6
actor 78.4
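
Clarifai concept scores like the list above come from its predict endpoint and are reported in [0, 1]; the page shows them as percentages. A hedged sketch against the v2 REST API; the API key is a placeholder, and the public general model id is an assumption (Clarifai has renamed its stock models over time):

```python
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public general model

with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # value is in [0, 1]; multiply by 100 to match the list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```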

Imagga
created on 2019-11-16

statue 36.7
sculpture 28.4
art 18.1
architecture 14.8
man 14.3
travel 14.1
monument 14
religion 13.4
city 12.5
stone 11.9
person 11.5
old 11.1
portrait 11
decoration 10.9
tourism 10.7
wheeled vehicle 10.7
historical 10.3
culture 10.3
tricycle 10.2
sky 10.2
figure 10.1
support 10
building 9.7
people 9.5
mask 9.4
adult 9.1
history 8.9
detail 8.8
antique 8.8
body 8.8
child 8.7
male 8.7
ancient 8.6
black 8.4
famous 8.4
traditional 8.3
historic 8.2
protection 8.2
dirty 8.1
landmark 8.1
outdoors 7.9
marble 7.7
vehicle 7.5
park 7.4
structure 7.4
lady 7.3
danger 7.3
sport 7.3
dress 7.2
women 7.1
face 7.1
swing 7.1
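
Imagga's tag list above matches its /v2/tags endpoint, which scores tags 0-100 and authenticates with HTTP basic auth. A minimal sketch; the key, secret, and image URL are placeholders:

```python
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # confidence is a 0-100 score, matching the numbers above.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```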

Google
created on 2019-11-16
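
No Google tags were recorded for this image. For reference, label detection with the google-cloud-vision client would look roughly like this (a sketch assuming application default credentials; "photo.jpg" is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # score is in [0, 1]; shown here as a percentage for consistency.
    print(f"{label.description} {label.score * 100:.1f}")
```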

Microsoft
created on 2019-11-16

clothing 92.8
person 90.2
outdoor 90.1
toddler 87.8
baby 84.6
human face 81.1
text 73
girl 70.6
child 66.5
black and white 66.3
smile 62.2
posing 48
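
The Microsoft tags above match the Azure Computer Vision "analyze" REST call with the Tags visual feature; confidences come back in [0, 1]. A sketch with placeholder endpoint and key (the API version may differ from the one used in 2019):

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.0/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # confidence is in [0, 1]; the list above shows percentages.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```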

Face analysis

AWS Rekognition

Age 5-15
Gender Female, 91.2%
Calm 99.1%
Fear 0%
Confused 0.1%
Surprised 0.1%
Angry 0.3%
Happy 0.1%
Sad 0.2%
Disgusted 0.1%

AWS Rekognition

Age 9-19
Gender Female, 53.9%
Confused 45%
Happy 45%
Disgusted 45%
Calm 54.8%
Angry 45%
Sad 45.1%
Surprised 45%
Fear 45%
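
Both blocks above follow the shape of Rekognition's DetectFaces output when called with Attributes=["ALL"]: an estimated age range, a gender value with confidence, and a percentage score per emotion, one block per detected face. A minimal sketch, assuming configured AWS credentials and a placeholder file name:

```python
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age, gender, and emotions
    )

for face in response["FaceDetails"]:  # one entry per detected face
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```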

Microsoft Cognitive Services

Age 29
Gender Female

Microsoft Cognitive Services

Age 12
Gender Female
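
The two age/gender estimates above match the Microsoft Face API detect call as it worked circa 2019 (these attributes have since been restricted). A sketch with placeholder endpoint and key:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("photo.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()

for face in resp.json():  # one entry per detected face
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}, Gender {attrs["gender"].title()}')
```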

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
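
The likelihood ratings above (Very unlikely through Very likely) are Google Cloud Vision face-detection output, one block per detected face. A sketch assuming application default credentials and a placeholder file name:

```python
from google.cloud import vision

def pretty(likelihood) -> str:
    # e.g. VERY_UNLIKELY -> "Very unlikely"
    return likelihood.name.replace("_", " ").capitalize()

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:  # one entry per detected face
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
```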

Feature analysis

Amazon

Person 99.2%

Categories

Imagga

paintings art 98.6%
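
The "paintings art" entry matches Imagga's categorizer endpoint. A sketch; the categorizer id "personal_photos" is an assumption (Imagga's stock categorizer for personal imagery), and the key, secret, and image URL are placeholders:

```python
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)
resp.raise_for_status()

for cat in resp.json()["result"]["categories"]:
    # confidence is a 0-100 score, as in the entry above.
    print(f'{cat["name"]["en"]} {cat["confidence"]:.1f}')
```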