Human Generated Data

Title

Untitled (street performers singing in park)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7556

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.8
Human 98.8
Person 93.9
Person 91.8
Building 90.9
Architecture 90.9
Tree 87.4
Plant 87.4
Person 83.4
Vegetation 77.4
Column 72.3
Pillar 72.3
People 65.4
Nature 61.3
Outdoors 61.3
Land 61.3
Grove 61.3
Woodland 61.3
Forest 61.3
Pedestrian 59.4
Crowd 56.6

Clarifai
created on 2023-10-25

people 99.8
group 97
adult 96.8
child 95.2
group together 94.1
man 92.8
many 92.3
woman 91.1
monochrome 89.9
wear 85.9
tree 85
recreation 84.6
art 83.1
home 82.9
administration 82.8
boy 81.5
several 80.5
print 78.6
war 77.7
military 77

Imagga
created on 2022-01-08

fountain 70.4
structure 48.7
water 30
people 17.3
sunset 17.1
beach 15.2
silhouette 14.9
summer 14.8
lake 14.7
outdoors 14.4
travel 14.1
park 13.5
man 13.4
sea 13.3
ocean 13.3
wet 12.5
leisure 12.5
world 12.4
serene 12.3
outdoor 12.2
black 12
landscape 11.9
light 11.6
sky 11.5
reflection 11
person 10.9
recreation 10.8
peaceful 10.1
relaxation 10
river 10
coast 9.9
vacation 9.8
sport 9.7
tranquil 9.1
sexy 8.8
body 8.8
boat 8.7
adult 8.4
old 8.4
waves 8.4
life 8.3
tourism 8.2
sun 8.2
sunlight 8
newspaper 8
couple 7.8
rock 7.8
scene 7.8
model 7.8
portrait 7.8
men 7.7
relax 7.6
dark 7.5
human 7.5
city 7.5
fun 7.5
one 7.5
vintage 7.4
building 7.2
happiness 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 99.6
outdoor 99
tree 98.4
black and white 96
water 83.1
monochrome 80.8
grave 79.7
cemetery 63.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Female, 99.1%
Calm 95.8%
Happy 2%
Sad 0.6%
Fear 0.6%
Disgusted 0.4%
Surprised 0.2%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 36-44
Gender Female, 99.3%
Happy 93.6%
Calm 1.7%
Fear 1%
Sad 1%
Disgusted 0.8%
Surprised 0.7%
Confused 0.6%
Angry 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Captions

Text analysis

Amazon

287173
are

Google

287173-
287173-