Human Generated Data

Title

Untitled (man in suit at table, Dallas, Texas)

Date

c. 1930s, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.61

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 98.5
Person 98.5
Indoors 82.1
Interior Design 82.1
Screen 78.1
Monitor 78.1
Electronics 78.1
Display 78.1
Furniture 62.6
Crowd 59.7

Imagga
created on 2021-12-14

television 100
broadcasting 89.5
monitor 67.1
telecommunication 65
screen 54.5
telecommunication system 52.2
display 44.8
medium 43
technology 39.3
computer 37.8
equipment 36.6
flat 32.8
electronic 30.8
video 26.1
modern 25.9
business 23.1
plasma 22.3
entertainment 21.2
digital 21.1
design 20.8
black 20.4
panel 20.3
wide 19.2
communication 18.5
blank 16.3
electronic equipment 15.8
visual 15.4
object 15.4
desktop 15.4
media 15.2
electronics 15.2
liquid 14.8
laptop 14.6
movie 14.5
home 14.4
presentation 14
information 13.3
show 13.3
crystal 13.2
office 12.9
frame 12.5
keyboard 12.2
space 11.6
3d 11.6
style 11.1
graphic 10.9
liquid crystal 10.9
futuristic 10.8
silver 10.6
network 10.2
web 10.2
gray 9.9
film 9.6
electrical 9.6
high 9.5
tech 9.5
data 9.1
studio 9.1
definition 8.8
portable 8.7
work 8.6
close 8.6
industry 8.5
finance 8.4
elegance 8.4
global 8.2
room 8.2
symbol 8.1
financial 8
interior 8
broadcast 7.9
cinema 7.8
thin 7.8
wallpaper 7.7
showing 7.5
notebook 7.5
future 7.4
single 7.4
inside 7.4
reflection 7.3
art 7.2

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 100
black 94
monitor 91.9
television 90.7
screen 76.6
cartoon 73.4
black and white 67.5
screenshot 64.7
image 36

Face analysis

AWS Rekognition

Age 44-62
Gender Male, 97.8%
Calm 85.3%
Angry 4.4%
Happy 4.4%
Sad 2.2%
Confused 1.4%
Surprised 1.4%
Disgusted 0.5%
Fear 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Monitor 78.1%

Captions

Microsoft

a flat screen television 68%
a flat screen tv sitting on top of a television 50.9%
a flat screen tv sitting in front of a television 50.8%

Text analysis

Amazon

85%
Paul
Studio
Texas
1930's
190
Paul Gittings Studio
6
Gittings
10
2
Dallas, Texas
с. 1930's
Dallas,
6 5
BALAI 2 6 5
с.
BALAI
YY3RA2
&CROP
MJIY YY3RA2 032K1
MJIY
032K1

Google

Gittings
c.
10
190
6 85% Paul Gittings Studio Da1las, Texas c. 1930's 10 190
6
Studio
Texas
1930's
85%
Paul
Da1las,