Human Generated Data

Title

Untitled (Steinmetz's with Cardozos family standing under tree)

Date

1953

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8170

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.9
Human 99.9
Clothing 99.9
Apparel 99.9
Person 99.8
Person 99.7
Person 99.6
Person 99.5
Dress 98.7
Person 92.7
Female 90.1
Face 89.7
Grass 86.9
Plant 86.9
Yard 86.2
Outdoors 86.2
Nature 86.2
Woman 78
Vegetation 77.8
People 73.6
Shorts 72.5
Robe 70.4
Fashion 70.4
Tree 68.9
Pants 68.7
Gown 66.5
Portrait 64.7
Photography 64.7
Photo 64.7
Text 61
Skirt 60.7
Smile 59.7
Man 58.4
Land 58.2
Standing 58.2
Wedding 58.1
Wedding Gown 55.1
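
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels API. A minimal sketch of how output in this shape could be produced with boto3 (an assumption about tooling; AWS credentials must be configured, and "photo.jpg" is a hypothetical local filename standing in for the digitized print):

    # Sketch: print label/confidence pairs like the Amazon tags listed above.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local filename
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=55,  # roughly the lowest score shown above
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")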

Clarifai
created on 2023-10-25

people 100
group 99.1
adult 99.1
group together 97.6
child 97.2
man 96.4
administration 96.2
woman 96
wear 94.7
portrait 93.4
family 93.1
offspring 90.7
leader 90.3
war 89.9
many 88
three 87.6
several 87.6
four 87.2
monochrome 86.5
sibling 85.6

Imagga
created on 2022-01-08

kin 48.7
people 22.3
male 19.2
man 18.8
person 17.9
world 16.1
silhouette 15.7
sky 15.3
summer 14.8
water 13.3
couple 13.1
groom 12.8
sunset 12.6
happiness 12.5
beach 11.9
love 11.8
sun 11.3
men 11.2
landscape 11.2
child 11.1
sea 10.9
dark 10.8
symbol 10.1
happy 10
park 9.9
travel 9.9
outdoors 9.8
old 9.7
light 9.6
black 9.6
bride 9.6
two 9.3
ocean 9.1
sport 8.8
adult 8.7
grass 8.7
portrait 8.4
wedding 8.3
tourism 8.2
room 8.2
coast 8.1
sunlight 8
life 7.9
clouds 7.6
art 7.6
field 7.5
leisure 7.5
girls 7.3
success 7.2
road 7.2
active 7.2
history 7.2
women 7.1

Microsoft
created on 2022-01-08

grass 99.4
text 99.3
person 97.6
outdoor 97.3
clothing 97.2
smile 94.4
standing 93
posing 91.8
dress 88.9
man 84.9
woman 83.3

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 85.2%
Happy 95.2%
Calm 2%
Sad 1.1%
Confused 0.8%
Surprised 0.4%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 30-40
Gender Male, 82.5%
Calm 98.6%
Sad 0.7%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%
Confused 0.1%
Surprised 0%
Fear 0%

AWS Rekognition

Age 35-43
Gender Female, 93.5%
Calm 98.9%
Happy 0.4%
Sad 0.3%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Angry 0%
Fear 0%

AWS Rekognition

Age 30-40
Gender Male, 97.8%
Calm 60.9%
Happy 38.6%
Surprised 0.3%
Sad 0.1%
Disgusted 0.1%
Fear 0.1%
Confused 0%
Angry 0%

AWS Rekognition

Age 54-62
Gender Male, 67.9%
Calm 65.2%
Happy 33.4%
Confused 0.4%
Disgusted 0.3%
Surprised 0.2%
Angry 0.1%
Sad 0.1%
Fear 0.1%
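
The five AWS Rekognition blocks above report an age range, a gender estimate, and emotion scores for each detected face; this matches the shape of the DetectFaces response when all facial attributes are requested. A minimal sketch under the same assumptions as the labeling example:

    # Sketch: per-face age range, gender, and emotion scores.
    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:  # hypothetical local filename
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include AgeRange, Gender, and Emotions
    )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")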

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
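
The Google Vision blocks report likelihood buckets ("Very unlikely", "Unlikely", "Possible") rather than percentages; these correspond to the likelihood enum in the Cloud Vision face detection response. A minimal sketch, assuming the google-cloud-vision client library is installed and authenticated, with the same hypothetical "photo.jpg":

    # Sketch: per-face likelihood buckets like the Google Vision blocks above.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:  # hypothetical local filename
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)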

Feature analysis

Amazon

Person 99.9%

Text analysis

Amazon

of
390
390 8
8
OUT of FOCUS.
FOCUS.
DVD
OUT

Google

-OUT
$.
39084 -OUT of Foc $.
39084
of
Foc
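
The fragments above are OCR hits on text visible in the scan (for example the "39084" and "OUT of FOCUS" markings), in the shape returned by Rekognition's DetectText and Cloud Vision's text detection. A minimal sketch under the same assumptions as the earlier examples:

    # Sketch: word- and line-level text detections like the fragments above.
    import boto3
    from google.cloud import vision

    with open("photo.jpg", "rb") as f:  # hypothetical local filename
        image_bytes = f.read()

    # Amazon: each detection carries the text, a WORD/LINE type, and a confidence.
    rekognition = boto3.client("rekognition")
    for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
        print(det["Type"], det["DetectedText"], round(det["Confidence"], 1))

    # Google: the first annotation is the full text block, the rest are single words.
    client = vision.ImageAnnotatorClient()
    response = client.text_detection(image=vision.Image(content=image_bytes))
    for annotation in response.text_annotations:
        print(annotation.description)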