Human Generated Data

Title

Untitled (School May Day performance: children gathered around elevated girl with flowers)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4594

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.4
Human 99.4
Person 98.6
Person 98.4
Person 96.6
Person 96
Clothing 95.4
Apparel 95.4
Person 93.8
Person 89.8
Face 89.1
People 88.6
Person 88.1
Person 87.3
Female 86.1
Smile 84.2
Plant 83.5
Tree 82.4
Dress 81.5
Person 75.6
Coat 75.5
Overcoat 75.5
Suit 75.5
Person 73
Outdoors 71.9
Yard 71.2
Nature 71.2
Girl 70.7
Kid 64.3
Child 64.3
Woman 64
Photography 63.6
Photo 63.6
Person 63
Play 59.6
Leisure Activities 58.4
Fashion 57.5
Gown 57.5
Family 57.3
Shorts 57
Flower 56.6
Blossom 56.6
Flower Arrangement 56.6
Grass 56.4
Meal 55.9
Picnic 55.9
Food 55.9
Vacation 55.9
Person 55.8
Toy 55.7

Clarifai
created on 2023-10-15

people 100
child 99
group 98.7
many 98.2
adult 97.7
group together 94.8
war 94.7
military 92.9
man 92.5
woman 91.2
administration 91
soldier 89.9
boy 86.2
campsite 85.2
family 85.1
skirmish 83.7
wear 82.6
home 81.4
print 79.3
several 78.4

Imagga
created on 2021-12-14

fountain 43.1
structure 33.3
world 31.1
sky 19.1
old 18.1
architecture 14.8
man 14.8
silhouette 14.1
statue 13.9
history 13.4
black 13.2
people 12.8
city 12.5
outdoor 12.2
travel 12
person 12
landscape 11.9
sculpture 11.6
art 11.2
snow 11.1
stone 10.9
vintage 10.8
tourism 10.7
memorial 10.7
male 10.6
building 10.5
grunge 10.2
danger 10
dirty 9.9
sunset 9.9
religion 9.9
toxic 9.8
mask 9.6
light 9.6
symbol 9.4
day 9.4
park 9.4
famous 9.3
dark 9.2
outdoors 9.1
protection 9.1
chemical 8.7
water 8.7
winter 8.5
vacation 8.2
industrial 8.2
destruction 7.8
ancient 7.8
cold 7.8
men 7.7
fog 7.7
two 7.6
beach 7.6
power 7.6
historical 7.5
frame 7.5
smoke 7.4
environment 7.4
sun 7.2
color 7.2
landmark 7.2
adult 7.2
summer 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.3
outdoor 91.8
person 90.3
clothing 87.9
drawing 74.4

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 36-52
Gender Male, 91.7%
Calm 96.2%
Happy 2.9%
Surprised 0.3%
Sad 0.2%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0%

AWS Rekognition

Age 42-60
Gender Male, 96.1%
Happy 43.9%
Calm 39.4%
Angry 8.6%
Surprised 2.2%
Sad 2.2%
Confused 1.4%
Fear 1.3%
Disgusted 1.1%

AWS Rekognition

Age 30-46
Gender Female, 78.3%
Calm 62.1%
Happy 16.2%
Surprised 9.5%
Fear 5.2%
Angry 3%
Sad 2.1%
Disgusted 1.2%
Confused 0.7%

AWS Rekognition

Age 19-31
Gender Female, 79.8%
Happy 46.6%
Fear 18.8%
Surprised 14.4%
Calm 8.8%
Angry 5.6%
Sad 3.5%
Confused 1.6%
Disgusted 0.7%

AWS Rekognition

Age 40-58
Gender Male, 95.8%
Calm 93.9%
Happy 3.7%
Sad 1.5%
Angry 0.3%
Confused 0.2%
Surprised 0.2%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 21-33
Gender Female, 94.6%
Calm 87.3%
Sad 6.5%
Happy 2.9%
Fear 0.9%
Angry 0.8%
Surprised 0.7%
Confused 0.7%
Disgusted 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 98.7%

Text analysis

Amazon

16503
16503.
-
:
,

Google

16503 16503.
16503
16503.