Human Generated Data

Title

Untitled (family portrait in yard near palm trees)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8709

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 98.9
Person 98.8
Person 97.7
Person 97.1
Clothing 96.2
Apparel 96.2
Shorts 92.9
Female 90.9
Face 90.4
Blonde 86
Kid 86
Girl 86
Teen 86
Woman 86
Child 86
Dress 85.8
Plant 85.6
Smile 84
People 80.3
Tree 78
Play 77.2
Leisure Activities 75
Poster 70.8
Advertisement 70.8
Art 67.9
Grass 67.1
Person 65.7
Photography 65.7
Photo 65.7
Portrait 65.2
Outdoors 63.4
Drawing 63.4
Collage 59.8
Vacation 59.1
Boy 58.1
Yard 57.3
Nature 57.3

Clarifai
created on 2023-10-25

people 99.9
child 98.9
group 97.9
group together 93.7
adult 93.6
boy 92.2
son 91.1
man 91.1
family 91
outfit 90.9
wear 90.7
war 89.6
recreation 89.4
leader 89.1
portrait 88.1
several 87.4
woman 86.9
nostalgic 86.7
music 86.3
two 85.3

Imagga
created on 2022-01-09

kin 25.6
grunge 16.2
negative 16.1
vintage 14.9
old 14.6
portrait 14.2
newspaper 13.4
architecture 13.3
art 13
film 12.9
building 12.4
dirty 11.7
dress 11.7
people 11.7
person 11.4
antique 11.2
black 10.8
product 10.3
child 10.3
decoration 10.1
retro 9.8
ancient 9.5
man 9.4
photographic paper 9.2
mother 9
structure 8.7
culture 8.5
face 8.5
design 8.4
power 8.4
dark 8.3
traditional 8.3
city 8.3
style 8.2
aged 8.1
creation 8
mask 7.9
color 7.8
tree 7.8
sport 7.7
outdoor 7.6
costume 7.6
statue 7.6
head 7.6
lady 7.3
detail 7.2
stone 7.2
sexy 7.2
love 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 91.8
text 89.1
person 86.5
outdoor 85.4
child 80.1
drawing 75.3
footwear 71.8
black and white 69.5
human face 57.3

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 54%
Happy 93.6%
Calm 5.8%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 16-22
Gender Female, 95.9%
Happy 81.7%
Sad 10.6%
Fear 5.2%
Angry 0.8%
Disgusted 0.7%
Calm 0.5%
Surprised 0.3%
Confused 0.2%

AWS Rekognition

Age 41-49
Gender Male, 73.8%
Calm 97.9%
Sad 0.6%
Happy 0.5%
Angry 0.3%
Disgusted 0.3%
Confused 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 34-42
Gender Female, 59.6%
Calm 68.2%
Surprised 11.8%
Happy 10.9%
Sad 6.4%
Confused 0.9%
Disgusted 0.8%
Angry 0.6%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

362
VAGOY