Human Generated Data

Title

Untitled (acrobats on trapeze/audience watching circus)

Date

1966

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11856

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 97.1
Person 96.4
Person 95.7
Person 94.8
Collage 94.6
Advertisement 94.6
Person 80.5
Shorts 77.1
Clothing 77.1
Apparel 77.1
Person 75.2
Acrobatic 68.8
People 67
Suit 66.5
Coat 66.5
Overcoat 66.5
Poster 59.8
Chair 58.9
Furniture 58.9
Girl 57.8
Female 57.8
Sport 57.4
Sports 57.4

Imagga
created on 2022-01-15

device 25.3
architecture 24
building 22
structure 19.9
sky 18.5
construction 16.3
steel 16.1
metal 16.1
city 15.8
ride 14.5
park 14.2
urban 14
equipment 13.8
industry 13.7
ferris wheel 13
support 12.8
business 12.1
new 12.1
modern 11.9
high 11.3
travel 11.3
industrial 10.9
glass 10.9
light 10.7
mechanism 10.3
bridge 10.2
design 10.1
work 9.5
exterior 9.2
rotating mechanism 9.2
skyscraper 9
technology 8.9
wall 8.7
ventilator 8.6
wheel 8.5
window 8.5
mechanical device 8.2
tower 8.1
night 8
ferris 7.9
old 7.7
skyline 7.6
office 7.6
cable 7.6
energy 7.6
power 7.6
frame 7.5
step 7.2
transportation 7.2
tool 7.2

Google
created on 2022-01-15

Photograph 94.2
White 92.2
Black 89.9
Black-and-white 87
Style 84.1
Line 81.7
People 78.8
Monochrome photography 74.6
Snapshot 74.3
Monochrome 74.1
Art 73.3
Event 69.5
Building 66.1
Naval architecture 65.6
Stock photography 64.6
Metal 63.8
Hat 61.4
Crew 60.9
Rope 55.4
Tall ship 55.3

Microsoft
created on 2022-01-15

text 96
person 93.3
black and white 91.8
ship 78.5
people 55.7

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Male, 96.2%
Calm 98.9%
Happy 0.7%
Angry 0.1%
Sad 0.1%
Confused 0.1%
Disgusted 0%
Surprised 0%
Fear 0%

Feature analysis

Amazon

Person 96.4%
Poster 59.8%

Captions

Microsoft

a group of people on a boat 57.2%
a group of people riding on the back of a boat 36.6%
a group of people in a boat 36.5%

Text analysis

Amazon

55
55 157
157
55156.
B
RAGUZ
ЛАООЛ

Google

55
55156.
55 057 55156.
057