Human Generated Data

Title

Untitled (woman with three children walking on sidewalk)

Date

1953

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8786

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.7
Person 99.7
Person 99.6
Person 99.4
Apparel 98.6
Clothing 98.6
Shorts 95.3
Shoe 86.2
Footwear 86.2
Home Decor 84.3
Outdoors 83.3
Female 82.9
Nature 78.4
Face 78.1
Tree 78
Plant 78
Skin 77.5
Shoe 77.3
People 76.1
Asphalt 74.7
Tarmac 74.7
Building 74.6
Shelter 69.7
Countryside 69.7
Rural 69.7
Woman 69.6
Shoe 69.2
Road 68.5
Path 65.6
Skirt 61.6
Urban 58

Clarifai
created on 2023-10-26

people 99.9
adult 98.3
group together 98.3
child 98.3
many 96.7
group 96.6
woman 96.6
street 95.5
man 95.3
wear 94.8
monochrome 93.8
two 92.9
recreation 90.8
several 90.1
one 85.6
athlete 83.9
three 82
five 81.2
outfit 80.6
art 80.5

Imagga
created on 2022-01-09

person 26.5
people 21.8
fountain 21.4
man 18.8
world 16.5
adult 16.3
structure 15.6
one 14.2
black 14.1
lady 13.8
fashion 13.6
human 13.5
silhouette 13.2
model 13.2
dark 11.7
portrait 11.6
city 11.6
body 11.2
sport 11.1
style 11.1
dress 10.8
happy 10.7
male 10.6
outdoors 10.4
pretty 9.8
dinner dress 9.7
sexy 9.6
urban 9.6
hair 9.5
walking 9.5
love 9.5
symbol 9.4
business 9.1
sunset 9
businessman 8.8
player 8.8
stadium 8.8
art 8.5
attractive 8.4
park 8.3
event 8.3
clothing 8.3
street 8.3
fun 8.2
suit 8.2
performer 8.2
water 8
travel 7.7
summer 7.7
crowd 7.7
sky 7.7
legs 7.5
lights 7.4
training 7.4
stage 7.4
competition 7.3
design 7.3
alone 7.3
indoor 7.3
sensuality 7.3
sun 7.2
mask 7.2
wet 7.2
romantic 7.1
athlete 7.1
posing 7.1
night 7.1
face 7.1
sax 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.9
outdoor 97.6
black and white 90.7
tree 78.2
footwear 76.6
clothing 71.9
person 58.4

Color Analysis

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 99.7%
Sad 42.1%
Calm 36.8%
Happy 13.6%
Fear 2%
Surprised 2%
Confused 1.6%
Disgusted 1.1%
Angry 0.7%

AWS Rekognition

Age 25-35
Gender Female, 54.8%
Calm 75.7%
Happy 17.3%
Sad 3.5%
Fear 1.1%
Surprised 1.1%
Confused 0.5%
Disgusted 0.4%
Angry 0.4%

AWS Rekognition

Age 26-36
Gender Female, 92.7%
Calm 63.5%
Sad 30.5%
Happy 2.8%
Angry 0.8%
Confused 0.7%
Surprised 0.6%
Fear 0.6%
Disgusted 0.5%

AWS Rekognition

Age 45-51
Gender Male, 94.8%
Surprised 65.9%
Calm 19%
Sad 7.9%
Happy 3.8%
Confused 1.7%
Angry 0.6%
Disgusted 0.6%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 86.2%

Text analysis

Amazon

39314.
8SA
CoVEETA
RODVR- CoVEETA
RODVR-

Google

39314.
39314.