Human Generated Data

Title

Untitled (aerial view of workers marching in strike on city street)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15582

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 92.3
Person 91.1
Vehicle 81.3
Car 81.3
Transportation 81.3
Automobile 81.3
Machine 81.1
Wheel 81.1
Crowd 79.3
Wheel 77.8
Person 76.4
Person 75.2
Indoors 71.4
People 67
Electronics 64.8
Display 64.8
Screen 64.8
Monitor 64.8
Person 63.9
Person 60.6
Person 57.5
Room 57.5
Sailor Suit 57
Table 56.6
Furniture 56.6
Person 44.6
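
These object and scene labels have the shape of output from the Amazon Rekognition DetectLabels API. A minimal sketch of how such tags could be reproduced with boto3 follows; the image file name, region, and the MaxLabels/MinConfidence values are illustrative assumptions, not values recorded with this object.

    import boto3

    # Hypothetical local copy of the photograph; the museum's actual source file is not specified.
    IMAGE_PATH = "photograph.jpg"

    rekognition = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

    with open(IMAGE_PATH, "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores, like the list above.
    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,       # illustrative cap
        MinConfidence=40,   # illustrative threshold
    )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')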

Imagga
created on 2022-02-05

equipment 59.6
electronic equipment 47.6
radio receiver 37.3
technology 32.6
receiver 32.6
radio 31.6
digital 22.7
set 21.7
computer 21.7
board 20.3
network 19.5
data 19.2
electronics 19
industry 18.8
close 18.2
electrical 18.2
sound 16.8
hardware 16.3
connection 15.5
music 15.3
card 15.3
audio 15.3
system 15.2
cassette 15.2
device 15.1
information 15
sequencer 15
electronic 14.9
apparatus 14.6
electric 14
motherboard 13.8
equalizer 13.3
retro 13.1
closeup 12.8
business 12.7
component 12.7
black 12.6
communication 12.6
control 12.4
communication system 12.2
old 11.8
science 11.6
electricity 11.3
loudspeaker 11.3
chip 11.1
processor 10.8
circuit 10.8
hand 10.6
tech 10.4
part 10.1
power 10.1
vintage 9.9
mixer 9.7
cable 9.5
media 9.5
server 9.1
design 9
blackboard 8.9
monitor 8.9
amplifier 8.8
switch 8.8
button 8.8
broadcasting 8.8
volume 8.8
speaker 8.7
engineering 8.6
container 8.4
room 8.2
object 8.1
metal 8
detail 8
memory 8
tape player 7.8
panel 7.7
cassette tape 7.7
communications 7.7
recorder 7.6
speed 7.3
global 7.3
office 7.2
modern 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 96.4
black and white 92.7
black 66.2
white 62.5
ship 57

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 93.7%
Happy 83.3%
Sad 4.9%
Fear 4.6%
Confused 2.5%
Angry 1.7%
Disgusted 1.6%
Surprised 1%
Calm 0.4%
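
The age range, gender, and emotion percentages above correspond to the FaceDetails structure returned by Amazon Rekognition's DetectFaces API. A hedged sketch, reusing the rekognition client and image_bytes from the earlier example:

    # Attributes=["ALL"] requests age range, gender, and emotion estimates.
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back as a list; sort by confidence to match the ordering above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')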

Feature analysis

Amazon

Person 91.1%
Car 81.3%
Wheel 81.1%

Captions

Microsoft

a group of people posing for a photo 64%
a group of people standing in a room 63.9%
a group of people in a room 63.8%

Text analysis

Amazon

SERKES
BURG
GIN
GIN BURG & SERKES
KREY
&
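
The detected strings above are consistent with Amazon Rekognition's DetectText API, which returns LINE and WORD detections for text found in the image. A minimal sketch, again reusing the client and image bytes from the first example:

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    # Each detection carries its type (LINE or WORD), the detected string, and a confidence score.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"], round(detection["Confidence"], 1))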

Google

BURG&
SERKES
KREY
GIN BURG& SERKES KREY
GIN
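
The Google results resemble the output of Google Cloud Vision text detection, where one annotation typically holds the full detected text and the others hold individual words. A sketch under that assumption, using the google-cloud-vision client library with a hypothetical local copy of the photograph:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # credentials are read from the environment

    with open("photograph.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    for annotation in response.text_annotations:
        print(annotation.description)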