Human Generated Data

Title

Untitled (family portrait in yard near palm trees)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8708

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Person 99.4
Person 99.1
Clothing 98.9
Apparel 98.9
Dress 97.6
Grass 95.9
Plant 95.9
Person 95
Female 93.6
Face 90.5
Shorts 90.2
Play 84.1
Kid 83.3
Child 83.3
Yard 77.3
Nature 77.3
Outdoors 77.3
Girl 76.7
Collage 75.2
Poster 75.2
Advertisement 75.2
People 72.4
Woman 72.2
Tree 70.3
Portrait 68.2
Photography 68.2
Photo 68.2
Baby 67.1
Blonde 66.8
Teen 66.8
Furniture 57.8
Costume 56.1
Person 48.8
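
The Amazon entries above are object and scene labels with confidence percentages from Amazon Rekognition. Below is a minimal sketch of how such labels could be requested with boto3; the file name, AWS region, and confidence floor are assumptions, not details from the record.

import boto3

# Assumed region and local file name; neither is specified in the record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_family_portrait.jpg", "rb") as f:
    image_bytes = f.read()

# Ask for labels down to a low confidence floor, matching the 48.8-99.4 range listed above.
response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=45)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')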

Clarifai
created on 2023-10-25

people 99.9
group 99
child 98.5
group together 97.5
adult 96.2
wear 95.6
administration 95.3
man 92.8
woman 92
boy 91.8
several 91.7
war 90
family 89.5
outfit 88.5
portrait 88.5
leader 87.1
monochrome 87.1
music 86.6
son 86.5
many 86.3

Imagga
created on 2022-01-09

cemetery 26.6
people 16.7
man 14.1
grunge 12.8
gravestone 12.3
stone 12.2
structure 11.7
person 11.7
outdoor 11.5
autumn 11.4
art 11.1
portrait 11
memorial 10.5
old 10.4
sexy 10.4
adult 10.3
dirty 9.9
body 9.6
love 9.5
color 9.4
male 9.3
vintage 9.1
danger 9.1
dress 9
history 8.9
sport 8.9
lady 8.9
style 8.9
tree 8.8
child 8.7
black 8.4
pretty 8.4
attractive 8.4
dark 8.3
fashion 8.3
face 7.8
model 7.8
fear 7.7
human 7.5
water 7.3
building 7.3
swing 7.2
lifestyle 7.2
kin 7.1
posing 7.1
summer 7.1
day 7.1
architecture 7
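
The Imagga tags above likewise pair a label with a confidence score. A hedged sketch against Imagga's public v2 tagging endpoint follows; the credentials, image URL, and endpoint details are assumptions drawn from Imagga's documentation, not from anything in the record.

import requests

# Hypothetical credentials and image URL; the record supplies neither.
IMAGGA_KEY = "your_imagga_api_key"
IMAGGA_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/steinmetz_family_portrait.jpg"

# The v2 tags endpoint returns a JSON body with tag names and confidence scores.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
print(response.json())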

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

grass 95.7
outdoor 92.5
clothing 86.6
person 78.6
text 78
black and white 71.1
child 58.4
toddler 51.2

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 66.5%
Calm 98.8%
Sad 0.6%
Happy 0.3%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 6-12
Gender Female, 95.8%
Calm 98.5%
Sad 0.5%
Surprised 0.4%
Happy 0.2%
Fear 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%

AWS Rekognition

Age 16-22
Gender Female, 63.3%
Happy 96.7%
Surprised 1.2%
Calm 0.6%
Fear 0.5%
Sad 0.5%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%

AWS Rekognition

Age 19-27
Gender Male, 94.1%
Happy 54.4%
Calm 37.1%
Disgusted 3.1%
Sad 2.6%
Confused 1%
Angry 0.8%
Surprised 0.5%
Fear 0.5%
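
Each AWS Rekognition block above summarizes one detected face: an age range, a gender estimate, and an emotion distribution. A minimal sketch of the corresponding call, reusing the boto3 client and image bytes assumed in the label-detection sketch:

# Reuses the `rekognition` client and `image_bytes` assumed earlier.
response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unordered; sort to reproduce the descending percentages above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')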

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
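
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) rather than percentages. A sketch using the google-cloud-vision client library, assuming the same local file name as above:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_family_portrait.jpg", "rb") as f:  # assumed file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)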

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

36305
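
The Amazon text analysis above found a single string, 36305. A sketch of the corresponding text-detection call, again reusing the client and image bytes assumed earlier:

# Reuses the `rekognition` client and `image_bytes` assumed earlier.
response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip WORD-level duplicates of the same string
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')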