Human Generated Data

Title

Untitled (School May Day performance: girls with dolls walking in a line around a circle)

Date

1941

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4593

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Nature 98
Outdoors 97.7
Person 95.8
Human 95.8
Clothing 95
Apparel 95
Person 92.5
Person 91.4
Person 90.5
Tent 88.1
Snow 88.1
Person 83.3
Person 82.2
Person 80.9
Person 80.7
Person 79.4
Person 78.2
Winter 76.8
Person 72.3
Weather 72.2
People 71.4
Person 69.4
Female 68
Person 64.9
Coat 62.8
Overcoat 61.3
Person 60.8
Ice 59.1
Face 58.5
Storm 58.3
Suit 56.2
Blizzard 55.8

Clarifai
created on 2023-10-15

people 99.6
adult 98.1
group 98.1
tent 97.1
many 96.5
man 94.4
woman 93.2
wear 93.1
group together 91.9
veil 91.2
illustration 89.4
campsite 89.4
military 84.4
leader 84.3
several 83.8
print 83.2
cavalry 82.5
art 82.1
wedding 80.4
crowd 80.2

Imagga
created on 2021-12-14

snow 26
sky 25.5
structure 23
winter 21.3
sea 21.1
canvas tent 20.9
sand 20.5
landscape 19.3
travel 18.3
ocean 16.9
water 16
shelter 15.8
beach 15.7
cold 15.5
mountain tent 15.4
cemetery 15.3
outdoor 14.5
season 14
old 13.9
scene 13.8
tent 13.2
black 13.2
tourism 13.2
summer 12.2
man 11.4
weather 11.1
clouds 11
coast 10.8
vacation 10.6
sun 10.5
stone 10.5
tree 10.1
scenery 9.9
religion 9.9
history 9.8
outdoors 9.8
trees 9.8
scenic 9.7
ice 9.6
people 9.5
silhouette 9.1
vintage 9.1
gravestone 8.9
country 8.8
snowy 8.8
hut 8.7
forest 8.7
building 8.7
fog 8.7
person 8.7
cloud 8.6
tourist 8.6
architecture 8.6
picket fence 8.4
house 8.4
church 8.3
lake 8.2
peaceful 8.2
sunset 8.1
rural 7.9
grass 7.9
day 7.8
wall 7.8
memorial 7.8
port 7.7
boat 7.7
frost 7.7
frozen 7.6
negative 7.6
fence 7.5
wood 7.5
desert 7.5
leisure 7.5
town 7.4
park 7.4
light 7.4
morning 7.2
holiday 7.2
river 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 88.9
wedding dress 77
black 76
white 70.9
black and white 61.1
drawing 60.1
sketch 58.1
old 52.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-40
Gender Female, 78.3%
Sad 94.2%
Fear 3.6%
Calm 1.7%
Confused 0.2%
Happy 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 21-33
Gender Female, 75.9%
Happy 52.8%
Calm 25.3%
Surprised 9.4%
Sad 4.9%
Disgusted 3%
Fear 2.7%
Angry 1.2%
Confused 0.6%

AWS Rekognition

Age 26-40
Gender Female, 64.9%
Calm 87.5%
Happy 9.5%
Sad 1.7%
Angry 0.4%
Surprised 0.3%
Disgusted 0.3%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 30-46
Gender Male, 90.4%
Surprised 34.1%
Fear 30.5%
Happy 21.6%
Angry 8.8%
Calm 3.2%
Sad 0.8%
Confused 0.6%
Disgusted 0.4%

AWS Rekognition

Age 12-22
Gender Male, 75.9%
Surprised 37.3%
Fear 21%
Happy 15.4%
Calm 11.5%
Sad 8.3%
Confused 3.1%
Angry 2.2%
Disgusted 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.8%
Tent 88.1%

Categories

Imagga

paintings art 96.5%
nature landscape 2.5%

Text analysis

Amazon

16509.

Google

I6509. 16509.
I6509.
16509.