Human Generated Data

Title

Untitled (woman walking on sidewalk with three children)

Date

1953, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.184

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 99.9
Apparel 99.9
Tree 99.8
Plant 99.8
Person 99.6
Human 99.6
Palm Tree 99.3
Arecaceae 99.3
Shoe 99.2
Footwear 99.2
Person 99.1
Person 98.8
Female 97.3
Skirt 95.8
Shorts 95.3
Neighborhood 93.6
Building 93.6
Urban 93.6
Shoe 90.9
Woman 90.9
People 75.1
Shoe 75.1
Vegetation 70.3
Face 67.6
Portrait 66.3
Photography 66.3
Photo 66.3
Summer 65.4
Road 60.8
Grass 57.6
Girl 57.3
Outdoors 57.2
Villa 55
Housing 55
House 55

Clarifai
created on 2023-10-25

people 99.9
child 99.8
group 98.8
two 98.7
group together 98.5
adult 96.4
three 96.3
monochrome 95.9
sibling 95.7
four 95.6
woman 94.5
offspring 94.4
recreation 94.3
family 93.2
wear 92.3
several 91.1
street 91
home 90.4
boy 90.1
portrait 89.7

Imagga
created on 2022-01-08

pedestrian 36.2
man 25.5
sport 24.9
person 22.8
people 22.3
city 20
outdoors 19.8
street 16.6
adult 16.2
male 15.7
outdoor 14.5
walking 14.2
leisure 14.1
fun 13.5
lifestyle 12.3
child 11.9
portrait 11.6
activity 11.6
summer 11.6
statue 11.3
couple 11.3
men 11.2
outside 11.1
women 11.1
active 11
crutch 10.9
dress 10.8
golf 10.5
pretty 10.5
one 10.5
ball 10.4
senior 10.3
park 9.9
recreation 9.9
human 9.7
lady 9.7
black 9.6
urban 9.6
day 9.4
model 9.3
travel 9.2
exercise 9.1
fashion 9
performer 9
game 8.9
happy 8.8
grass 8.7
standing 8.7
love 8.7
play 8.6
staff 8.5
action 8.3
stick 8.3
sky 8.3
road 8.1
mask 8.1
athlete 7.7
attractive 7.7
walk 7.6
swing 7.6
musical instrument 7.5
joy 7.5
world 7.5
traditional 7.5
building 7.5
teenager 7.3
protection 7.3
happiness 7.1

Google
created on 2022-01-08

Footwear 98
Plant 96.4
Tree 90.3
Dress 89.7
Black-and-white 85.8
Happy 85.4
Gesture 85.3
Style 84
People in nature 83.8
Adaptation 79.3
Toddler 76.8
Monochrome 75.9
Monochrome photography 75.7
Fun 74.2
Sky 72.6
Vintage clothing 71.8
Child 71.7
Event 71.1
Palm tree 69.1
Room 67.1

Microsoft
created on 2022-01-08

outdoor 99.4
sky 99
tree 98.1
clothing 91.4
footwear 90
text 87
person 83.7
dress 70.7
posing 66.7
black and white 61.4
way 41.1

Face analysis

AWS Rekognition

Age 1-7
Gender Female, 100%
Happy 97.9%
Calm 0.7%
Confused 0.5%
Sad 0.3%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 6-14
Gender Female, 99.9%
Happy 87.9%
Confused 3.5%
Disgusted 1.9%
Surprised 1.9%
Sad 1.7%
Angry 1.5%
Fear 1.1%
Calm 0.4%

AWS Rekognition

Age 2-8
Gender Male, 99.7%
Happy 70.6%
Calm 26.5%
Confused 1.4%
Sad 0.8%
Disgusted 0.2%
Surprised 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 43-51
Gender Female, 100%
Happy 91.6%
Surprised 2.8%
Angry 1.8%
Confused 1.5%
Fear 0.8%
Sad 0.5%
Disgusted 0.5%
Calm 0.4%

Microsoft Cognitive Services

Age 44
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 99.2%