Human Generated Data

Title

Untitled (women using matches to light a pipe)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14637

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 98.2
Human 98.2
Tub 85.6
Bathtub 84.8
Finger 62.1
Plant 60.6
Room 59
Indoors 59

Clarifai
created on 2023-10-27

people 99.8
monochrome 99.5
portrait 99.4
one 97.4
man 97.2
adult 97
actor 89.9
art 89.8
music 89.5
black and white 89.2
scientist 87.8
studio 80.8
square 78.3
mirror 77.9
science 76.9
lid 76.9
indoors 75
model 74.4
self 74.1
face 71.3

Imagga
created on 2022-01-29

man 28.9
person 25.8
people 25.1
adult 22.8
male 22
laptop 19.9
computer 18.6
portrait 17.5
business 15.8
happy 15.7
negative 15.6
technology 15.6
office 15.3
television 15.1
worker 15.1
work 14.9
professional 14.8
looking 14.4
working 13.3
senior 13.1
corporate 12.9
face 12.8
human 12.7
elderly 12.4
smiling 12.3
lifestyle 12.3
shower cap 12.2
mature 12.1
home 12
film 11.7
job 11.5
businessman 11.5
cap 11.3
old 11.1
hair 11.1
device 11
telecommunication system 10.9
smile 10.7
modern 10.5
attractive 10.5
monitor 9.8
indoors 9.7
sitting 9.4
desk 9.4
relaxation 9.2
alone 9.1
indoor 9.1
holding 9.1
photographic paper 9
lady 8.9
clothing 8.5
communication 8.4
pretty 8.4
one 8.2
gray 8.1
blackboard 8.1
women 7.9
black 7.8
expression 7.7
health 7.6
serious 7.6
bath 7.6
hand 7.6
businesspeople 7.6
headdress 7.6
one person 7.5
equipment 7.5
shirt 7.5
manager 7.4
glasses 7.4
telephone 7.4
executive 7.4
safety 7.4
electronic equipment 7.1
copy 7.1
vessel 7.1
bathtub 7

Microsoft
created on 2022-01-29

text 99.9
electronics 86.4
human face 79.9
black and white 69.7
display 60.5
posing 43.6
computer 37.4

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Female, 93.1%
Calm 99.8%
Surprised 0%
Happy 0%
Sad 0%
Confused 0%
Disgusted 0%
Angry 0%
Fear 0%

Feature analysis

Amazon

Person
Person 98.2%

Categories

Imagga

paintings art 95.1%
interior objects 3.2%

Text analysis

Amazon

ON
Mod
MJI7
A70A
MJI7 YESTAL A70A
и
YESTAL

Google

MJ17 YT37A 2 A7OA
MJ17
YT37A
2
A7OA