Human Generated Data

Title

Untitled (nuns sitting near the water)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7795

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Nature 95.6
Outdoors 93.7
Person 92.7
Human 92.7
Person 91.5
Person 90.7
Person 88.5
Person 87.3
Person 77.8
Clothing 76
Apparel 76
Person 73.5
Ice 67
Weather 65.5
Person 64.9
Person 58.7
Snow 56.8
Sea 55.1
Water 55.1
Ocean 55.1
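The Rekognition list above pairs each label with a confidence score, and several labels (e.g. "Person") recur once per detected instance. A minimal Python sketch of one way to post-process such a list, filtering by a confidence threshold and deduplicating, with the pairs transcribed from the record; the threshold of 80 is an arbitrary choice for illustration:

```python
# Label/confidence pairs transcribed from the Amazon Rekognition tags above.
tags = [
    ("Nature", 95.6), ("Outdoors", 93.7), ("Person", 92.7), ("Human", 92.7),
    ("Person", 91.5), ("Person", 90.7), ("Person", 88.5), ("Person", 87.3),
    ("Person", 77.8), ("Clothing", 76.0), ("Apparel", 76.0), ("Person", 73.5),
    ("Ice", 67.0), ("Weather", 65.5), ("Person", 64.9), ("Person", 58.7),
    ("Snow", 56.8), ("Sea", 55.1), ("Water", 55.1), ("Ocean", 55.1),
]

def top_tags(pairs, threshold=80.0):
    """Return unique labels with confidence >= threshold, keeping the
    highest confidence seen per label, sorted by descending confidence."""
    best = {}
    for label, conf in pairs:
        if conf >= threshold and conf > best.get(label, 0.0):
            best[label] = conf
    return sorted(best.items(), key=lambda kv: -kv[1])

print(top_tags(tags))
```

Applied to this record, the sketch keeps only Nature, Outdoors, Person, and Human, collapsing the nine per-instance "Person" detections into one entry at the highest confidence.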

Clarifai
created on 2023-10-25

monochrome 99.5
people 98.2
winter 95.6
snow 95.2
sea 94.9
beach 94.2
group 94.1
ice 92.6
man 90.4
ocean 89.5
cold 87.7
water 86.6
adult 85.6
black and white 84.6
seashore 84.5
rock 83.2
group together 83.1
nature 83.1
frosty 82.5
many 77.1

Imagga
created on 2022-01-09

ocean 37.7
sea 37.6
beach 37.3
sand 36.1
robe 35.1
sky 30.1
water 28
garment 27.3
coast 27
landscape 23.8
shore 22.5
summer 21.9
rock 21.7
outdoor 21.4
travel 21.1
clothing 17.6
seaside 17
tourism 16.5
sunset 16.2
sun 16.1
clouds 15.2
coastline 15.1
cow 14.6
natural 14.1
scenic 14.1
vacation 13.9
island 13.7
shoreline 13.5
seascape 13.4
silhouette 13.3
waves 13
people 12.8
tourist 12.8
man 12.8
outdoors 12.7
mountain 12.5
sunrise 12.2
weather 12
scenery 11.7
soil 11.2
person 11.1
animals 11.1
calm 11
coastal 10.7
rural 10.6
covering 10.4
camel 10.4
rocks 10.4
cattle 10.3
season 10.1
relax 10.1
color 10
male 10
stone 10
earth 9.7
bay 9.4
outside 9.4
hill 9.4
evening 9.3
lake 9.2
peace 9.1
park 9.1
horizon 9
consumer goods 8.9
surf 8.7
fishing 8.7
day 8.6
holiday 8.6
colorful 8.6
dusk 8.6
snow 8.4
leisure 8.3
group 8.1
farm 8
white 7.8
pier 7.8
wave 7.8
sunny 7.8
cloud 7.8
men 7.7
tree 7.7
dog 7.5
boat 7.4
light 7.4
peaceful 7.3
fisherman 7.3
ranch 7
cliff 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 99.4
text 98.1
snow 92.1
black and white 86.7
beach 73.3
water sport 60.5
water 59.8
mountain 50.4
image 38.8

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 72%
Calm 52.3%
Surprised 23.6%
Angry 9.7%
Happy 6.9%
Sad 2.9%
Disgusted 2%
Fear 1.3%
Confused 1.3%

AWS Rekognition

Age 20-28
Gender Female, 81%
Happy 26.6%
Calm 21.3%
Sad 17.9%
Confused 11.8%
Fear 7.3%
Angry 6.3%
Disgusted 5%
Surprised 3.8%

AWS Rekognition

Age 28-38
Gender Female, 50.5%
Happy 85.1%
Sad 8.3%
Fear 2.4%
Confused 1.2%
Calm 1.1%
Surprised 1%
Disgusted 0.7%
Angry 0.4%

AWS Rekognition

Age 27-37
Gender Female, 59.4%
Calm 74.6%
Fear 12.1%
Angry 4%
Disgusted 2.6%
Sad 2.5%
Happy 2%
Surprised 1.6%
Confused 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible
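Each AWS Rekognition face record above gives a probability for every emotion, so the "dominant" emotion is simply the highest-scoring entry. A small illustrative sketch, with values transcribed from the first face-analysis block in this record:

```python
# Emotion scores transcribed from the first AWS Rekognition face above.
face = {
    "Calm": 52.3, "Surprised": 23.6, "Angry": 9.7, "Happy": 6.9,
    "Sad": 2.9, "Disgusted": 2.0, "Fear": 1.3, "Confused": 1.3,
}

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # ('Calm', 52.3)
```

Note that the dominant label varies per face in this record (Calm, Happy, Calm, Calm), and the third face's 85.1% Happy is the only strongly peaked distribution.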

Feature analysis

Amazon

Person 92.7%

Text analysis

Amazon

41390