Human Generated Data

Title

Untitled (women using cosmetics, Suffolk, Virginia)

Date

c. 1931, printed later

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.984

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.4
Human 99.4
Person 99.3
Person 98.2
Person 98
Person 97.7
Monitor 97.7
Electronics 97.7
Screen 97.7
Display 97.7
Person 97.4
Person 95.1
Military 93.9
Military Uniform 93.6
Person 91.8
Armored 84.6
Army 84.6
People 82.3
Person 81.8
Person 79.1
Person 71.5
Person 69.8
Soldier 68.1
Officer 67.6
Person 62.2

Clarifai
created on 2023-10-15

people 99.9
group 99.2
adult 98.3
monochrome 98.2
woman 96.9
man 96.7
child 95.4
furniture 93.9
group together 93.8
war 93.7
wear 93.5
indoors 93.4
movie 93.4
portrait 93.2
family 92.3
uniform 90.4
medical practitioner 90.2
room 88.9
documentary 88.7
administration 87.8

Imagga
created on 2021-12-14

television 77.8
monitor 73.9
screen 46.1
telecommunication system 42.4
computer 40.2
display 32.5
electronic equipment 31.7
office 30.5
equipment 30.2
technology 28.2
laptop 27.6
background 26.3
business 26.1
web site 24.4
keyboard 19.7
work 18.8
broadcasting 16.2
man 16.1
businessman 15.9
modern 15.4
person 14.3
communication 14.3
desk 14.2
working 14.1
people 13.9
home 12.8
telecommunication 12.3
electronic 12.1
male 12
corporate 12
sitting 12
happy 11.9
notebook 11.7
flat 11.6
smile 11.4
hand 11.4
one 11.2
design 10.7
electronic device 10.5
tech 10.4
professional 10.1
pretty 9.8
digital 9.7
looking 9.6
wireless 9.5
object 9.5
career 9.5
frame 9.2
data 9.1
global 9.1
adult 9
black 9
information 8.9
portable 8.7
smiling 8.7
desktop 8.6
network 8.6
tie 8.5
mobile 8.5
web 8.5
contemporary 8.5
presentation 8.4
executive 8.3
room 8.2
job 8
interior 8
plasma 7.8
space 7.8
media 7.6
desktop computer 7.6
medium 7.5
senior 7.5
key 7.5
close 7.4
confident 7.3
businesswoman 7.3
gray 7.2
suit 7.2
team 7.2
science 7.1
worker 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 99.8
person 98.3
clothing 97.7
monitor 95.5
man 89.3
window 85.1
posing 83.8
poster 61.5
group 61
picture frame 8.1

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 34-50
Gender Female, 84.7%
Surprised 75.1%
Calm 19.5%
Happy 2.1%
Fear 1.2%
Sad 0.7%
Confused 0.5%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 16-28
Gender Female, 86%
Calm 97.7%
Sad 1.9%
Angry 0.2%
Happy 0.1%
Disgusted 0%
Fear 0%
Confused 0%
Surprised 0%

AWS Rekognition

Age 13-25
Gender Female, 91%
Calm 53.3%
Sad 33.9%
Confused 4.2%
Fear 2.9%
Surprised 2.2%
Happy 2.2%
Angry 1%
Disgusted 0.3%

AWS Rekognition

Age 11-21
Gender Female, 88.7%
Calm 87.6%
Sad 9.6%
Angry 1.4%
Confused 0.7%
Surprised 0.3%
Fear 0.2%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 5-15
Gender Female, 81.7%
Sad 64.3%
Calm 33.9%
Angry 0.7%
Fear 0.5%
Happy 0.2%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 49-67
Gender Male, 80.2%
Calm 71.8%
Surprised 20.5%
Sad 4.1%
Fear 2.6%
Angry 0.5%
Confused 0.3%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 10-20
Gender Female, 99.7%
Calm 89.6%
Sad 8.5%
Disgusted 0.6%
Angry 0.5%
Happy 0.3%
Confused 0.2%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 46-64
Gender Male, 60.2%
Calm 90.9%
Sad 7.1%
Fear 0.9%
Surprised 0.5%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 20-32
Gender Male, 91.6%
Calm 87.3%
Sad 6.6%
Happy 3.2%
Angry 0.9%
Confused 0.6%
Surprised 0.5%
Disgusted 0.4%
Fear 0.4%

AWS Rekognition

Age 14-26
Gender Male, 62%
Calm 92.5%
Fear 3.6%
Happy 1.3%
Surprised 1.1%
Sad 1%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 9-19
Gender Female, 64.9%
Calm 80.9%
Sad 11.7%
Angry 2.2%
Disgusted 1.7%
Surprised 1%
Fear 0.9%
Happy 0.9%
Confused 0.8%

AWS Rekognition

Age 38-56
Gender Female, 81.8%
Surprised 47.4%
Calm 33.2%
Angry 8.6%
Disgusted 3.3%
Fear 2.7%
Sad 2.2%
Happy 1.8%
Confused 0.9%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.4%
Monitor 97.7%

Categories

Imagga

paintings art 96.3%
text visuals 2.5%

Text analysis

Amazon

190
47
82%
VIRGINIA
MOOKE)
Suffock, VIRGINIA
(W.E.A. MOOKE)
HAMBLIN
HAMBLIN STUDIO
STUDIO
(W.E.A.
Suffock,
14
c.1931
age

Google

THe
STUDIO
(W.E.A.
MOOK
E)
14
190
THe Stor 82% HAM BLIN STUDIO (W.E.A. MOOK E) SuffoLk, UIRGINIA C.1931 47 14 190
Stor
82%
HAM
BLIN
SuffoLk,
UIRGINIA
C.1931
47