Human Generated Data

Title

Untitled (Cristiani Circus two frames: performers at back of truck, camera man and performer)

Date

1955

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11863

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.1
Human 99.1
Apparel 98.9
Clothing 98.9
Person 98.3
Person 92.8
Collage 91.1
Advertisement 91.1
Person 88.1
Face 81.7
Person 66.9
Person 65.5
Female 64.7
Suit 59.9
Coat 59.9
Overcoat 59.9
Hat 55.8
Poster 53.1
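
The record names only the provider and date; as a minimal sketch, tag/confidence pairs like those above could be produced with Amazon Rekognition's DetectLabels API via boto3 (the file name and thresholds below are hypothetical, not the museum's actual pipeline):

    import boto3

    rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

    # Hypothetical local copy of the digitized frame
    with open("steinmetz_4.2002.11863.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,
        MinConfidence=50,  # the lowest score listed above is roughly 53%
    )

    # Print label names with confidence, matching the format of the list above
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")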

Clarifai
created on 2023-10-26

monochrome 99.8
people 99.8
adult 98.9
wear 96.3
man 95.2
chair 94.3
group together 94.3
woman 92.5
nostalgia 90.6
musician 90
music 89.7
vehicle 88.4
child 87.6
administration 87.6
two 87.5
drum 86.2
actress 85.4
transportation system 84.8
three 83.3
street 83.1

Imagga
created on 2022-01-15

equipment 29.5
backboard 26.5
comic book 25.8
device 15.6
cassette tape 14.5
grunge 14.5
television 13.2
drawing 12.9
technology 12.6
old 12.5
black 12
sketch 11.9
power 11.7
magnetic tape 11.7
newspaper 11.7
design 11.2
art 10.4
print media 10.3
window 10.1
man 10.1
sport 10
building 9.6
computer 9.6
speed 9.2
business 9.1
vintage 9.1
memory device 9.1
dirty 9
retro 9
transportation 9
metal 8.8
working 8.8
background 8.4
people 8.4
network 8.3
digital 8.1
cable 8
product 7.8
travel 7.7
frame 7.7
poster 7.6
one 7.5
room 7.4
screen 7.3
modern 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99
black and white 82.2
drawing 68
old 57.1
person 50.9
posing 45.7

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 96.1%
Calm 73.7%
Sad 20.2%
Happy 3.6%
Confused 0.7%
Angry 0.7%
Surprised 0.5%
Disgusted 0.3%
Fear 0.2%
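
Assuming the age range, gender, and emotion scores above come from Amazon Rekognition's DetectFaces API with all attributes requested, a minimal boto3 sketch (file name hypothetical):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.11863.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # age range, gender, and emotions require "ALL"
    )

    # Print the same attributes shown above, one block per detected face
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")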

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
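
The likelihood labels above match the enum names returned by Google Cloud Vision face detection; a minimal sketch using the google-cloud-vision Python client (file name hypothetical; the record does not confirm this exact pipeline):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes GOOGLE_APPLICATION_CREDENTIALS is set

    with open("steinmetz_4.2002.11863.jpg", "rb") as f:  # hypothetical file name
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    # One block of likelihoods per detected face, mirroring the listing above
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)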

Feature analysis

Amazon

Person 99.1%
Poster 53.1%

Categories

Imagga

paintings art 98.8%

Text analysis

Amazon

5
8
WFLA-TV 8
KODAK
WFLA-TV
FILM
N
SAFETY
44320.
44319.
N IBC
IBC
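
Assuming the tokens above are word-level detections from Amazon Rekognition's DetectText API, a minimal boto3 sketch (file name hypothetical):

    import boto3

    rekognition = boto3.client("rekognition")

    with open("steinmetz_4.2002.11863.jpg", "rb") as f:  # hypothetical file name
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # WORD detections correspond to short tokens like "KODAK" in the list above
    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"])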

Google

WFLA-TV
319.
44320.
KODAK
SAFETY
FILM
WFLA-TV e4 319. 44320. KODAK SAFETY SAFETY FILM KODAK
e4
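
The long line above resembles the full-text annotation that Google Cloud Vision text detection returns as its first result, followed by per-word entries; a minimal sketch with the google-cloud-vision client (file name hypothetical):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("steinmetz_4.2002.11863.jpg", "rb") as f:  # hypothetical file name
        content = f.read()

    response = client.text_detection(image=vision.Image(content=content))

    # The first annotation is the full detected text block; the rest are words
    for annotation in response.text_annotations:
        print(annotation.description)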