Human Generated Data

Title

Untitled (man outside circus tent balancing balls)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8486

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 99.5
Person 97.8
Building 93.6
Nature 93.6
Countryside 93.6
Shelter 93.6
Rural 93.6
Outdoors 93.6
Tent 93.2
Camping 91.4
Apparel 83.9
Clothing 83.9
Female 74.9
Leisure Activities 70.7
Furniture 70.7
Chair 70.7
Face 67.3
Photo 65.1
Portrait 65.1
Photography 65.1
Plant 62
Yard 60.6
People 59.8
Text 56.1

Imagga
created on 2022-01-15

canvas tent 100
travel 21.8
sky 20.4
vacation 18.8
landscape 17.8
outdoor 16.8
summer 15.4
tree 15.4
beach 14.3
tent 14
people 12.8
outdoors 12.7
recreation 12.5
holiday 12.2
tourism 11.5
park 11.5
sun 11.3
leisure 10.8
building 10.7
mountain 10.7
umbrella 10.6
forest 10.4
adventure 10.4
architecture 10.2
clouds 10.1
island 10.1
water 10
camping 9.8
vacations 9.4
man 9.4
sea 9.4
resort 9.4
sport 9.2
city 9.1
business 9.1
tourist 9.1
sunset 9
fun 9
equipment 8.8
grass 8.7
sunny 8.6
tropical 8.5
ocean 8.3
technology 8.2
structure 8.2
camp 7.9
work 7.8
industry 7.7
old 7.7
relax 7.6
relaxation 7.5
parasol 7.4
environment 7.4
calm 7.3
protection 7.3
metal 7.2
lifestyle 7.2
coast 7.2
activity 7.2
adult 7.1
day 7.1

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

outdoor 99.1
text 97.6
black and white 81.5
person 68.2
tent 60.8
outdoor object 53

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Male, 66.2%
Sad 36.2%
Confused 19.9%
Calm 15.7%
Happy 9.4%
Surprised 9%
Disgusted 4.7%
Angry 4.1%
Fear 1%

AWS Rekognition

Age 45-53
Gender Female, 61.8%
Calm 61.9%
Surprised 30.5%
Sad 2.4%
Angry 1.7%
Disgusted 1.2%
Confused 0.9%
Happy 0.8%
Fear 0.6%

AWS Rekognition

Age 37-45
Gender Male, 75.4%
Calm 97.5%
Sad 0.7%
Disgusted 0.5%
Angry 0.4%
Surprised 0.4%
Happy 0.2%
Confused 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Tent 93.2%

Captions

Microsoft

a man standing in front of a tent 92.3%
a group of baseball players standing on top of a tent 53.8%
a man in a tent 53.7%

Text analysis

Amazon

a
16030.
100-NA2A3

Google

16
HACON-YT3RA2-NAMTZA3
T6030.
16 HACON-YT3RA2-NAMTZA3 T6030.