Human Generated Data

Title

Untitled (Air Force officer in front of charts and map)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16125

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Human 99.7
Person 99.7
Electronics 73.2
Screen 73.2
Monitor 71.6
Display 71.6
LCD Screen 69.8
Furniture 63.8
Desk 59.3
Table 59.3
Indoors 58.1
White Board 57.1
Architecture 56.8
Building 56.8
Tower 56.8
Spire 56.3
Steeple 56.3

Imagga
created on 2022-02-05

case 36.2
center 27.6
office 27
interior 22.1
monitor 22
computer 21.7
working 20.3
equipment 20
work 19.6
person 19.4
technology 18.5
table 18.5
business 18.2
education 18.2
people 17.3
furniture 17
man 16.8
desk 16.8
indoors 16.7
home 15.9
house 15.9
worker 15.2
display 15
adult 15
room 15
male 14.9
television 14.7
indoor 14.6
laptop 13.9
modern 13.3
happy 12.5
smiling 12.3
lifestyle 12.3
shop 12.2
smile 12.1
screen 12.1
occupation 11.9
school 11.7
restaurant 11.5
professional 11.2
device 10.9
communication 10.9
kitchen 10.4
chair 10.4
classroom 10.4
portrait 10.3
window 10
light 10
locker 9.7
job 9.7
counter 8.8
studying 8.6
sitting 8.6
industry 8.5
background 8.5
keyboard 8.4
student 8.1
digital 8.1
metal 8
food 8
fastener 7.8
3d 7.7
pretty 7.7
machine 7.7
college 7.6
imagination 7.6
dinner 7.6
building 7.5
wood 7.5
holding 7.4
coffee 7.4
inside 7.4
book 7.3
group 7.2
looking 7.2
science 7.1
to 7.1
steel 7.1
electronic equipment 7

Google
created on 2022-02-05

Vision care 89.4
Amber 86.7
Mirror 86.6
Eyewear 81.3
Map 67.6
Fixture 66.4
Event 65.4
Electricity 61.5
Glass 57.5
Room 57.3
Light fixture 56.1
Machine 55.9
Suit 53
Tourist attraction 50

Microsoft
created on 2022-02-05

text 98.1
wall 95.1
indoor 91.4
person 85.2

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 99.7%
Calm 80.1%
Sad 10.6%
Confused 2.7%
Angry 2.4%
Disgusted 1.8%
Surprised 1%
Happy 1%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a person standing in front of a mirror posing for the camera 71.5%
a person standing in front of a mirror posing for the camera 71.4%
a person standing in front of a mirror 71.3%

Text analysis

Amazon

CTCA
RADIO
TRAFFIC
SAC
SAC CTCA
CHANNEL
CHANNEL AND TRAFFIC
AND
SAC TELEPHONE NETWORK
SYSTEM
NETWORK
ALERTING
CONTROL
NE
CONTROL AGENCY
TELEPHONE
COMMANDERS
AGENCY
TWORK
PRIMARY
TELETYPE
SAC PRIMARY ALERTING SYSTEM
NET
SAC TELETYPE NE TWORK
SAC RADIO NETWORK -21
0000
-21
OINHOO

Google

CONTROL AGENCY SAC COMMANDERS RADIO NET SAC PRIMARY ALERTING SYSTEM SAC AND TRAFFIC SAC CICA SAC TELETYPE NETWORK SAC RADIO TELEPHONE NETWORK 21 SAC TELEPHONE NETWORK
SAC
PRIMARY
ALERTING
NETWORK
CONTROL
NET
CICA
TELETYPE
RADIO
TELEPHONE
AGENCY
SYSTEM
21
AND
COMMANDERS
TRAFFIC