Human Generated Data

Title

Untitled (woman on chair in garden)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19515

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 98.6
Human 98.6
Nature 86.4
Outdoors 76.8
Clothing 73.4
Apparel 73.4
Sitting 71.3
People 63.9
Tree 63.9
Plant 63.9
Ground 60.6
Photography 60.4
Photo 60.4
Person 42.2
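
The label/confidence pairs above have the shape returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags could be reproduced with boto3 follows; the S3 bucket and object key are hypothetical placeholders, not the museum's actual storage.

# Minimal sketch: image tags via Amazon Rekognition DetectLabels.
# Assumes AWS credentials are configured; bucket/key below are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-images", "Name": "steinmetz-19515.jpg"}},
    MaxLabels=20,
    MinConfidence=40.0,
)

# Each label carries a name and a confidence score (percent), as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')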

Clarifai
created on 2023-10-22

people 99.8
one 99.4
portrait 98.3
sit 98.1
adult 98
woman 97.6
monochrome 96
man 95.2
art 93.5
veil 93.3
elderly 91.5
sitting 87.4
wear 86.8
dark 86.6
chair 86
seat 82.7
lid 82.7
black and white 82
child 81.4
dress 80.9

Imagga
created on 2022-03-05

man 30.9
covering 28.9
cloak 27.5
male 23
sport 18.9
person 18.5
adult 18.2
people 17.3
mask 16.7
gun 16.2
mountain 15.7
soldier 14.7
landscape 14.1
outdoor 13.8
military 13.5
hiking 13.5
rock 13
weapon 12.9
clothing 12.4
adventure 12.3
backpack 12
outdoors 11.8
active 11.7
silhouette 11.6
protection 10.9
camouflage 10.8
travel 10.6
rifle 10.5
stone 10.4
walking 10.4
uniform 10.2
black 9.8
darkness 9.8
forest 9.6
water 9.3
dark 9.2
danger 9.1
sky 8.9
wall 8.8
hike 8.8
sitting 8.6
summer 8.4
park 8.3
safety 8.3
alone 8.2
light 8
hiker 7.9
solitude 7.7
outside 7.7
old 7.7
walk 7.6
tourist 7.6
one 7.5
action 7.4
mountains 7.4
sports 7.4
sun 7.2
sunset 7.2
religion 7.2
scenic 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

outdoor 99
text 97.7
black and white 94
clothing 91.9
person 91.6
monochrome 83.3
man 63
megalith 19.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-36
Gender Male, 82.1%
Calm 59.1%
Sad 8.8%
Surprised 8.3%
Fear 8%
Disgusted 7.3%
Confused 5.6%
Angry 1.8%
Happy 1.1%
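
The age range, gender estimate, and emotion scores above match the structure returned by Rekognition's DetectFaces operation when all facial attributes are requested. A minimal sketch under that assumption, again with a hypothetical bucket and key:

# Minimal sketch: face attributes via Amazon Rekognition DetectFaces.
# Assumes AWS credentials are configured; bucket/key are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-images", "Name": "steinmetz-19515.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back as type/confidence pairs, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')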

Feature analysis

Amazon

Person
Person 98.6%
Person 42.2%

Categories

Captions

Text analysis

Amazon

83
2218 83
2218
ابج
# d.
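
The fragments above are raw in-image text detections of the kind returned by Rekognition's DetectText operation. A minimal sketch, with a hypothetical bucket and key:

# Minimal sketch: in-image text via Amazon Rekognition DetectText.
# Assumes AWS credentials are configured; bucket/key are hypothetical.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-images", "Name": "steinmetz-19515.jpg"}}
)

# LINE detections group WORD detections; the raw listing above mixes both.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])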

Google

EB VEEIA BVCL
EB
VEEIA
BVCL