Human Generated Data

Title

Untitled (woman playing catch with a small child at the beach)

Date

c. 1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10480

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.2
Person 99.2
Person 98.4
Nature 98
Outdoors 94.9
Person 92.4
Person 88.5
Person 87.6
Tree 81.5
Plant 81.5
Weather 80.9
Person 66
Transportation 65.3
Vehicle 65.3
Ice 63.3
Machine 63.2
Person 62.4
Tire 61.2
Spoke 61.1
Airplane 60.9
Aircraft 60.9
Road 56.5
Person 44.3
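
The Amazon tags above are the kind of label/confidence pairs returned by AWS Rekognition's label-detection endpoint. A minimal sketch of such a call, assuming boto3 with valid AWS credentials and a placeholder local file name:

```python
import boto3

# Placeholder file name; any JPEG/PNG can be passed as raw bytes.
IMAGE_PATH = "steinmetz_beach.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns labels with confidence scores on a 0-100 scale,
# comparable to the "Human 99.2", "Nature 98", ... values listed above.
response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=40,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```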

Clarifai
created on 2023-10-25

people 98.6
vehicle 97.3
watercraft 95.5
monochrome 95
group 94.2
transportation system 93.2
smoke 92.7
adult 92.2
no person 88.3
man 87.9
many 86.1
war 83.5
military 83.3
ship 81.4
art 80.6
group together 79.3
water 77.8
aircraft 76.5
music 74.7
tree 73.5

Imagga
created on 2022-01-09

car mirror 44.7
mirror 35.3
sky 31.3
billboard 29.7
reflector 26.7
cloud 26.7
landscape 26
structure 25.5
signboard 24.1
blackboard 21.7
clouds 20.3
water 19.3
stage 19
night 16.9
television 16.3
travel 16.2
sunset 15.3
platform 14.6
sea 14.1
light 14
environment 14
smoke 14
power 13.4
factory 12.5
pollution 12.5
scenic 12.3
summer 12.2
sun 12.1
danger 11.8
industrial 11.8
beach 11.8
dark 11.7
ocean 11.6
scene 11.3
sunrise 11.2
outdoor 10.7
storm 10.6
industry 10.3
black 10.2
lake 10.2
architecture 10.2
city 10
scenery 9.9
horizon 9.9
coast 9.9
steam 9.7
chemical 9.7
fog 9.6
dusk 9.5
air 9.2
tourism 9.1
dirty 9
outdoors 9
river 8.9
chimney 8.8
toxic 8.8
day 8.6
nobody 8.6
building 8.5
tree 8.5
cloudy 8.4
boat 8.4
silhouette 8.3
weather 8.2
vacation 8.2
film 8.1
natural 8
broadcasting 8
negative 7.7
screen 7.6
energy 7.6
evening 7.5
tranquil 7.2
color 7.2
trees 7.1
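
The Imagga tags follow the same pattern. A rough sketch of how such tags are commonly requested, assuming Imagga's v2 REST tagging endpoint with HTTP Basic auth; the key, secret, and image URL below are placeholders:

```python
import requests

# Placeholder credentials and image URL for the Imagga v2 tagging API.
API_KEY = "YOUR_API_KEY"
API_SECRET = "YOUR_API_SECRET"
IMAGE_URL = "https://example.org/steinmetz_beach.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each tag carries a confidence score, comparable to the
# "car mirror 44.7", "sky 31.3", ... values listed above.
for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```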

Google
created on 2022-01-09

Black 89.6
Sky 89.2
Cloud 88.7
Plant 88.7
Drum 86.9
Arecales 84.3
Style 83.8
Black-and-white 83.4
Rectangle 81.2
Tree 80.8
Font 79.9
Adaptation 79.3
Tints and shades 77.4
Monochrome photography 76.1
Monochrome 75.4
Grass 74
Chair 72.1
Landscape 71.1
Palm tree 70.5
Event 69.5
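
The Google labels resemble output from the Cloud Vision API's label-detection feature. A minimal sketch, assuming the google-cloud-vision client library, application-default credentials, and a placeholder file name:

```python
from google.cloud import vision

# Placeholder path; label detection also accepts a remote image URI.
IMAGE_PATH = "steinmetz_beach.jpg"

client = vision.ImageAnnotatorClient()

with open(IMAGE_PATH, "rb") as f:
    image = vision.Image(content=f.read())

# Scores are returned on a 0-1 scale; scaling by 100 makes them
# comparable to the "Black 89.6", "Sky 89.2", ... values listed above.
response = client.label_detection(image=image)
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```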

Microsoft
created on 2022-01-09

text 99.7
outdoor 93.9
black and white 84.3
old 84.1
sky 73
cloud 70
steam 59.3
engine 40.4
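
The Microsoft tags look like output from Azure's Computer Vision image-analysis service. A rough sketch against the v3.2 REST Analyze endpoint; the resource endpoint, subscription key, and image URL are placeholders:

```python
import requests

# Placeholder endpoint, key, and image URL for an Azure Computer Vision resource.
ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
SUBSCRIPTION_KEY = "YOUR_KEY"
IMAGE_URL = "https://example.org/steinmetz_beach.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/json",
    },
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Confidence is returned on a 0-1 scale, comparable (after scaling)
# to the "text 99.7", "outdoor 93.9", ... values listed above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```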

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-24
Gender Male, 93.9%
Calm 83%
Sad 6.7%
Happy 2.8%
Disgusted 2.7%
Angry 1.8%
Confused 1.2%
Fear 1%
Surprised 0.7%

AWS Rekognition

Age 20-28
Gender Female, 71.9%
Happy 34.8%
Sad 20.5%
Calm 19.2%
Fear 8.4%
Disgusted 8.2%
Surprised 3.7%
Confused 3.3%
Angry 1.9%

AWS Rekognition

Age 9-17
Gender Male, 96.6%
Calm 98.2%
Happy 0.9%
Sad 0.3%
Angry 0.1%
Surprised 0.1%
Fear 0.1%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 13-21
Gender Male, 86.4%
Calm 52.4%
Fear 26.7%
Sad 5.5%
Surprised 4.2%
Angry 3.5%
Disgusted 3.1%
Happy 3.1%
Confused 1.6%
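
The face-analysis entries above (age range, gender, and per-emotion scores) match the shape of AWS Rekognition's DetectFaces output. A minimal sketch, assuming boto3 with valid AWS credentials and a placeholder file name:

```python
import boto3

# Placeholder file name; the image is sent as raw bytes.
IMAGE_PATH = "steinmetz_beach.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotions
# in addition to the default bounding-box data.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```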

Feature analysis

Amazon

Person 99.2%
Airplane 60.9%

Categories

Imagga

text visuals 98.4%

Captions

Text analysis

Amazon

44585

Google

牛4S9S
4S9S
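
The strings under Text analysis are OCR detections from these services, e.g. the "44585" reported by Amazon. A minimal sketch of the corresponding AWS Rekognition DetectText call, assuming boto3 with valid AWS credentials and a placeholder file name:

```python
import boto3

# Placeholder file name; the image is sent as raw bytes.
IMAGE_PATH = "steinmetz_beach.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectText returns both LINE and WORD detections with confidences;
# the detected strings correspond to entries such as "44585" above.
response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")
```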