Human Generated Data

Title

Untitled (family looking at pool at recreation center north of Tampa)

Date

c. 1970

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11581

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 99.9
Person 99.9
Person 99.4
Person 96.2
Plant 94.4
Person 93.7
Tree 93.7
Outdoors 93.4
Countryside 93.4
Shelter 93.4
Nature 93.4
Rural 93.4
Building 93.4
Person 93.1
Clothing 92.9
Apparel 92.9
Yard 92.2
Grass 91.3
Shorts 89.2
Person 88.9
Person 84
Female 81.2
Face 75.4
Vegetation 75.2
People 74.3
Person 72.2
Palm Tree 70.6
Arecaceae 70.6
Vehicle 68.3
Airplane 68.3
Transportation 68.3
Aircraft 68.3
Person 65.2
Woman 63.4
Person 62.2
Furniture 59.9
Leisure Activities 59.8
Crowd 59.6
Housing 58.6
House 58.6
Villa 58.6
Person 52.5
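
The Amazon tags above, label names paired with confidence scores, have the shape of AWS Rekognition's detect_labels output. A minimal sketch of how such a tag list could be reproduced with boto3; the file name photo.jpg and the 50-point confidence floor are assumptions, not part of this record:

import boto3

# Hypothetical reconstruction: detect_labels returns label names with
# confidence scores like the list above. "photo.jpg" is a placeholder.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as image:
    response = client.detect_labels(
        Image={"Bytes": image.read()},
        MinConfidence=50,
    )
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')

Repeated labels such as the multiple Person entries come from separate detected instances of the same label, each carrying its own confidence.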

Imagga
created on 2022-01-15

radio telescope 100
astronomical telescope 90.9
telescope 68
magnifier 45.1
sky 30.7
scientific instrument 22.7
beach 21.6
cloud 15.5
sea 14.8
parasol 13.8
sand 13.6
summer 13.5
water 12.7
travel 12.7
landscape 12.6
ocean 12.4
tree 12.3
structure 12
sport 12
technology 11.9
clouds 11.8
field 11.7
line 11.5
sun 11.3
high 10.4
building 10.4
architecture 10.1
energy 10.1
people 10
outdoor 9.9
rope 9.9
vacation 9.8
adult 9.7
sunny 9.5
man 9.4
shore 9.4
power 9.2
freedom 9.1
black 9
coast 9
outdoors 9
construction 8.5
tropical 8.5
electricity 8.5
palm 8.4
color 8.3
city 8.3
active 8.1
grass 7.9
urban 7.9
day 7.8
industry 7.7
old 7.7
winter 7.7
happy 7.5
silhouette 7.4
person 7.4
ecology 7.3
lifestyle 7.2
mountain 7.1
male 7.1
equipment 7.1
scenic 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

outdoor 97.8
text 97.6
black and white 85.2
sky 83.1
person 82.8
clothing 68.7

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Female, 87.1%
Calm 81%
Sad 11.8%
Confused 3.3%
Angry 1.2%
Disgusted 1%
Happy 0.8%
Surprised 0.6%
Fear 0.4%

AWS Rekognition

Age 16-24
Gender Female, 83.5%
Calm 85.3%
Sad 8.2%
Surprised 2.9%
Happy 1.1%
Fear 1%
Confused 0.6%
Disgusted 0.5%
Angry 0.4%
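
The age ranges, gender estimates, and ranked emotion percentages above correspond to AWS Rekognition face analysis. A minimal sketch, assuming boto3 and the same placeholder file as above, of how such per-face attributes could be requested:

import boto3

# Hypothetical reconstruction: detect_faces with Attributes=["ALL"] returns
# AgeRange, Gender, and ranked Emotions per face. "photo.jpg" is a placeholder.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as image:
    response = client.detect_faces(
        Image={"Bytes": image.read()},
        Attributes=["ALL"],
    )
for face in response["FaceDetails"]:
    age = face["AgeRange"]        # e.g. {"Low": 22, "High": 30}
    gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 87.1}
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f'Age {age["Low"]}-{age["High"]}, '
          f'{gender["Value"]} {gender["Confidence"]:.1f}%, '
          f'{top["Type"].title()} {top["Confidence"]:.1f}%')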

Feature analysis

Amazon

Person 99.9%
Airplane 68.3%

Captions

Microsoft

a vintage photo of some people that are standing in the grass 70.6%
a vintage photo of a person 70.5%
a vintage photo of a person 70.4%

Text analysis

Amazon

50449.