Human Generated Data

Title

Untitled (two girls in matching dresses, holding hands, standing in front of trees by urn)

Date

c. 1940

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12724

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Shorts 99.9
Clothing 99.9
Apparel 99.9
Person 99.5
Human 99.5
Person 99.3
Female 91.2
Shoe 87.2
Footwear 87.2
Plant 80.5
Tree 79.8
Woman 79.2
Vegetation 79.2
Bush 76.6
Helmet 73
Shoe 65.6
Path 64.4
Skirt 63.2
Cat 62.8
Animal 62.8
Mammal 62.8
Pet 62.8
Outdoors 61.2
Shoe 52.8
Shoe 52.2
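
These label/confidence pairs appear to come from Amazon's Rekognition label-detection service (the face analysis section below names AWS Rekognition explicitly). A minimal sketch of how such tags could be reproduced, assuming the boto3 client and a hypothetical local file name:

    import boto3

    # A sketch only: assumes AWS credentials are configured and that
    # "photo.jpg" (a hypothetical file name) holds the scanned photograph.
    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(Image={"Bytes": f.read()})

    # Print each label with its confidence score, mirroring the tag list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")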

Clarifai
created on 2023-10-29

people 99.9
two 98.6
adult 98.4
woman 98.3
child 98.2
wear 95.6
man 92.2
portrait 90
offspring 88
group 87.1
one 86.6
three 86.1
wedding 85.4
monochrome 85.3
dress 84.7
administration 83.9
four 83.8
son 82.2
actress 80.3
street 80.1

Imagga
created on 2022-02-04

swing 33.4
mechanical device 27.3
plaything 27.2
people 21.2
person 20.8
mechanism 20.3
man 18.8
sexy 18.5
adult 18.2
body 16.8
posing 15.1
fashion 15.1
model 14.8
sunset 14.4
male 14.3
hair 14.3
silhouette 14.1
attractive 14
black 13.6
dark 13.4
couple 12.2
lady 12.2
love 11.8
child 11.2
women 11.1
skin 11
portrait 11
dress 10.8
human 10.5
one 10.5
wall 10.3
happy 10
outdoor 9.9
style 9.6
walking 9.5
light 9.4
water 9.3
musical instrument 9.2
pretty 9.1
sensual 9.1
sport 9.1
park 9.1
active 9
world 8.6
beach 8.4
summer 8.4
outdoors 8.3
dirty 8.1
sun 8.1
dance 8
lifestyle 8
performer 7.9
autumn 7.9
happiness 7.8
dancer 7.8
outfit 7.7
clothing 7.6
legs 7.5
enjoy 7.5
passion 7.5
fun 7.5
leisure 7.5
future 7.4
vintage 7.4
action 7.4
street 7.4
alone 7.3
sensuality 7.3
pose 7.2
sky 7

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 98
clothing 96.2
person 92.2
black and white 91.8
woman 88.7
footwear 82.9
dress 82.3
monochrome 70.3
picture frame 14.8

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 94.7%
Calm 94.3%
Happy 2.4%
Sad 2%
Surprised 0.7%
Confused 0.2%
Fear 0.2%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 81.7%
Calm 92.2%
Sad 2.6%
Fear 2.1%
Happy 1.6%
Surprised 0.8%
Disgusted 0.3%
Angry 0.2%
Confused 0.2%
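
The two blocks above, presumably one per detected face, follow the shape of AWS Rekognition's face-detection output: an estimated age range, a gender guess, and per-emotion confidences. A minimal sketch, again assuming boto3 and a hypothetical file name:

    import boto3

    # A sketch only: "photo.jpg" is a hypothetical file name.
    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()}, Attributes=["ALL"]
        )

    # One block per detected face: estimated age range, gender, and emotions.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")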

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
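
The likelihood ratings above (Very unlikely, Unlikely, and so on) match the face-annotation fields returned by the Google Cloud Vision API. A minimal sketch, assuming the google-cloud-vision client library and a hypothetical file name:

    from google.cloud import vision

    # A sketch only: assumes application default credentials are set up;
    # "photo.jpg" is a hypothetical file name.
    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each likelihood is an enum ranging from VERY_UNLIKELY to VERY_LIKELY.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)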

Feature analysis

Amazon

Person 99.5%
Person 99.3%
Shoe 87.2%
Shoe 65.6%
Shoe 52.8%
Shoe 52.2%
Helmet 73%
Cat 62.8%