Human Generated Data

Title

Untitled (woman and child sitting on lawnchair)

Date

1959

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8098

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Nature 96.4
Outdoors 94.9
Person 92.2
Tree 90.3
Plant 90.3
Grass 89.8
Water 84.5
Canine 78
Animal 78
Mammal 78
Person 77.2
Clothing 75.4
Apparel 75.4
Vegetation 75.4
Chair 73.9
Furniture 73.9
Shorts 72.8
Building 72.8
Pet 72.6
Yard 66.5
Asphalt 65.2
Tarmac 65.2
Housing 65.1
Ice 64.4
Bench 60.7
Weather 59.3
Play 58.8
Female 58.4
Urban 57.4
Palm Tree 57
Arecaceae 57
Countryside 56.8
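
The Amazon tags above have the shape of AWS Rekognition label detection output, where each number reads as a confidence score on a 0-100 scale. As a rough illustration only, the sketch below shows how comparable label/confidence pairs could be pulled for an image with boto3; the filename and the MaxLabels/MinConfidence values are placeholders, not settings taken from this record.

# Hedged sketch: producing Rekognition-style label tags for a local image file.
# "steinmetz_8098.jpg" is a placeholder filename, not part of this record.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8098.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# much like the "Person 99.4" / "Tree 90.3" pairs listed above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # cap on returned labels (illustrative value)
    MinConfidence=55.0,  # drop low-confidence labels (illustrative value)
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')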

Clarifai
created on 2023-10-26

people 98.6
monochrome 98.2
tree 96.1
group 94.5
landscape 93.1
infrared 93.1
art 93
park 92.4
beach 92.3
black and white 91.4
adult 90.7
one 89.9
two 88.9
man 87.9
bench 87.6
cavalry 87.3
wear 86.7
fence 86.6
street 85.5
vehicle 84.9

Imagga
created on 2022-01-15

west highland white terrier 38.1
shopping cart 33.2
terrier 32
landscape 31.2
sky 28.2
handcart 26.6
hunting dog 24.2
tree 22.1
snow 20.9
water 20.7
wheeled vehicle 20
outdoor 17.6
beach 17.5
dog 17.3
weather 15.8
travel 14.8
scenery 14.4
sunset 14.4
environment 14
old 13.9
scene 13.8
trees 13.3
sea 13.3
cloud 12.9
container 12.5
rural 12.3
forest 12.2
field 11.7
radio telescope 11.6
cold 11.2
countryside 11
architecture 10.9
ocean 10.9
park 10.7
light 10.7
river 10.7
sun 10.5
grass 10.3
winter 10.2
structure 10.2
clouds 10.1
lake 10.1
city 10
silhouette 9.9
horizon 9.9
ice 9.9
scenic 9.7
urban 9.6
astronomical telescope 9.3
ecology 9.2
outdoors 9
summer 9
country 8.8
domestic animal 8.8
sand 8.8
canine 8.7
season 8.6
land 8.5
building 8.2
tourism 8.2
tower 8.1
seasonal 7.9
spring 7.8
destruction 7.8
fog 7.7
construction 7.7
outside 7.7
industry 7.7
climate 7.6
dark 7.5
sunrise 7.5
hill 7.5
evening 7.5
plant 7.4
tranquil 7.2
coast 7.2
shore 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

outdoor 98.7
text 91.2
sky 82.3
black and white 80.9
old 45.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 89.3%
Happy 54.5%
Calm 32.9%
Sad 2.5%
Fear 2.4%
Confused 2.1%
Disgusted 2.1%
Surprised 1.9%
Angry 1.6%

AWS Rekognition

Age 27-37
Gender Male, 57.4%
Happy 64.8%
Calm 32%
Fear 0.8%
Surprised 0.7%
Confused 0.5%
Disgusted 0.4%
Sad 0.4%
Angry 0.3%

AWS Rekognition

Age 19-27
Gender Male, 96.7%
Calm 98.5%
Sad 1%
Disgusted 0.2%
Fear 0.1%
Happy 0.1%
Surprised 0%
Confused 0%
Angry 0%
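
The three face blocks above match the shape of AWS Rekognition DetectFaces output: an estimated age range, a gender guess with its confidence, and a set of emotion scores, all expressed as confidences out of 100. A minimal sketch of how such values could be read with boto3 follows; the filename is again a placeholder and not part of this record.

# Hedged sketch: reading face attributes in the same shape as the blocks above.
# "steinmetz_8098.jpg" is a placeholder filename, not part of this record.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_8098.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] asks for age range, gender, and emotion estimates.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')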

Feature analysis

Amazon

Person 99.4%
Bench 60.7%

Categories

Imagga

nature landscape 84.8%
beaches seaside 13.2%

Captions

Text analysis

Amazon

K44542

Google

カカ
MJI=てカSカカ
MJI
=
S