Human Generated Data

Title

Untitled (men giving speech with machinery in background)

Date

1963

People

Artist: Jack Rodden Studio, American 1914 - 2016

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13756

Machine Generated Data

Tags

Amazon
created on 2022-02-04

Person 98.7
Human 98.7
Person 96.8
Person 96.7
Person 95.5
Person 92.7
Musician 90.7
Musical Instrument 90.7
Poster 89.5
Advertisement 89.5
Person 81.3
Person 75.3
Crowd 73.4
Person 71.9
Meal 69.4
Food 69.4
Building 65.7
Smoke 63.7
People 61.6
Music Band 58.9
Leisure Activities 57.1
Guitarist 55.3
Guitar 55.3
Performer 55.3

Clarifai
created on 2023-10-29

people 99.8
group 97.9
adult 97.2
group together 97.1
man 96.4
woman 95.1
many 94.3
music 93
monochrome 89.3
child 87.5
wear 84.9
war 84.5
musician 84.2
ceremony 83.8
vehicle 83.8
military 82.6
art 82.1
audience 81
one 80.8
portrait 80.2

Imagga
created on 2022-02-04

stage 39.1
platform 29.5
sky 19.8
building 18.2
industrial 18.2
industry 17.9
park 15.6
architecture 14.8
structure 14.7
factory 14.5
lamp 14.3
water tower 14.2
city 14.1
old 13.2
tank 13.1
equipment 12.7
technology 12.6
power 12.6
pollution 12.5
smoke 12.1
tower 11.6
black 11.4
water 11.3
reservoir 10.7
travel 10.6
landscape 10.4
construction 10.3
tract 10.2
spotlight 10.1
dark 10
night 9.8
ancient 9.5
symbol 9.4
device 9.4
house 9.2
global 9.1
vessel 9.1
silhouette 9.1
dirty 9
metal 8.9
destruction 8.8
man 8.7
chemical 8.7
cloud 8.6
outdoor 8.4
percussion instrument 8.4
sign 8.3
environment 8.2
musical instrument 8.2
danger 8.2
light 8
urban 7.9
modern 7.7
clouds 7.6
energy 7.6
outdoors 7.5
air 7.4
music 7.2
sunset 7.2
balloon 7.2
machine 7.1
steel 7.1

Google
created on 2022-02-04

Microsoft
created on 2022-02-04

text 99.9
black and white 89.2
concert 88
black 71.2
posing 44.6
old 42.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 7-17
Gender Female, 87.6%
Fear 77.6%
Calm 18.5%
Sad 1.7%
Disgusted 0.5%
Happy 0.5%
Surprised 0.5%
Angry 0.4%
Confused 0.3%

Feature analysis

Amazon

Person
Poster
Person 98.7%
Person 96.8%
Person 96.7%
Person 95.5%
Person 92.7%
Person 81.3%
Person 75.3%
Person 71.9%
Poster 89.5%

Text analysis

Amazon

FILM
KODAK SAFETY FILM
KODAK
SAFETY
11
ARMY

Google

KODAK SAFETY FILM 11
KODAK
SAFETY
FILM
11