Human Generated Data

Title

Untitled (man displaying machine parts)

Date

c. 1950

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1852

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.5
Human 99.5
Glasses 81.9
Accessories 81.9
Accessory 81.9
Face 71.3
Portrait 67.5
Photography 67.5
Photo 67.5
Furniture 65.4
Table 62
Clinic 59
Teeth 56
Mouth 56
Lip 56
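
The labels above match the shape of AWS Rekognition's detect_labels output. A minimal sketch of such a call, assuming boto3 and a local scan of the print (the file name is a hypothetical placeholder):

# Minimal sketch: fetch Rekognition labels for a local image file.
import boto3

client = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the lowest score shown above is 56
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')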

Clarifai
created on 2023-10-15

people 99.7
adult 98.8
one 98
science 97.5
monochrome 96.5
scientist 96.3
man 95.6
two 93.5
indoors 90.7
industry 88
medical practitioner 87.9
military 87.9
war 87.1
wireless communication 85.1
physics 84.7
biologist 82
technology 80.6
wear 79
exploration 77.6
instrument 72.8
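
Concept scores of this kind can be requested over Clarifai's v2 REST API. A minimal sketch, assuming the public general image recognition model; the API key and image URL are hypothetical placeholders:

# Minimal sketch: request Clarifai concepts for an image by URL.
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},  # hypothetical key
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photograph.jpg"}}}]},
)
resp.raise_for_status()

# Clarifai scores are 0-1; scale to match the percentages listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')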

Imagga
created on 2021-12-14

equipment 53.4
electronic equipment 50.7
tape player 48.7
disk jockey 45.4
broadcaster 36.3
person 34.6
communicator 27.2
man 26.9
technology 20.8
work 20.4
computer 20.1
people 20.1
male 19.8
working 18.5
business 17.6
adult 16.8
job 16.8
professional 16
device 14.6
worker 13.3
hand 12.9
office 12.8
occupation 12.8
laptop 12.7
digital 12.1
black 12
gramophone 11.5
happy 11.3
machine 11
radio 10.7
industry 10.2
industrial 10
music 9.9
handsome 9.8
interior 9.7
portrait 9.7
entertainment 9.2
record player 9.2
modern 9.1
one 9
home 8.8
mechanical 8.7
smiling 8.7
lifestyle 8.7
men 8.6
notebook 8.4
power 8.4
attractive 8.4
inside 8.3
holding 8.2
cheerful 8.1
indoors 7.9
hardware 7.7
repair 7.7
money 7.6
studio 7.6
communication 7.6
sound 7.5
action 7.4
phone 7.4
safety 7.4
hat 7.3
protection 7.3
kitchen 7.2
face 7.1
paper 7.1
engineer 7
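
Imagga tags like these come from its /v2/tags endpoint, authenticated with HTTP Basic auth. A minimal sketch; the key, secret, and image URL are hypothetical placeholders:

# Minimal sketch: request Imagga tags for an image by URL.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photograph.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # hypothetical credentials
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')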

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 100
person 98
black and white 82.2
drawing 73.3
clock 70.3
cartoon 68.4
clothing 67.8
human face 66.4
preparing 62.7
poster 57.6
man 55.1
cooking 31.1
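
Tags of this kind are returned by the Azure Computer Vision analyze operation. A minimal sketch against the v3.2 REST API; the resource endpoint, subscription key, and image URL are hypothetical placeholders:

# Minimal sketch: request Azure Computer Vision tags for an image by URL.
import requests

resp = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # hypothetical key
    json={"url": "https://example.org/photograph.jpg"},
)
resp.raise_for_status()

# Azure confidences are 0-1; scale to match the percentages listed above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')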

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 30-46
Gender Male, 96.4%
Calm 95.4%
Surprised 4.1%
Angry 0.2%
Happy 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0%
Sad 0%
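
These estimates match the face attributes (age range, gender, emotions) returned by Rekognition's detect_faces call. A minimal sketch, assuming boto3; the file name is a hypothetical placeholder:

# Minimal sketch: fetch Rekognition face attributes for a local image file.
import boto3

client = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unsorted; list them strongest first, as above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')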

Feature analysis

Amazon

Person 99.5%
Glasses 81.9%

Categories

Imagga

paintings art 98.4%

Text analysis

Amazon

20
30
40
50
10
o
x
DRIVER
WORKS
********
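
Strings like these are what Rekognition's detect_text returns for an image. A minimal sketch, assuming boto3; the file name is a hypothetical placeholder:

# Minimal sketch: fetch Rekognition text detections for a local image file.
import boto3

client = boto3.client("rekognition")

with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is either a full LINE or an individual WORD.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])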

Google

T to 10 20 20 30 30 40 50 HADOH-YT A-1HAMT2A 3
T
to
20
30
40
HADOH-YT
A-1HAMT2A
3
10
50
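
Output of this shape, one full-text string followed by the individual tokens, matches Google Cloud Vision text detection. A minimal sketch, assuming the google-cloud-vision client library and application default credentials; the file name is a hypothetical placeholder:

# Minimal sketch: run Cloud Vision text detection on a local image file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # hypothetical scan of the print
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full text block; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)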