Human Generated Data

Title

Untitled (office of Dr. Herman M. Juergens, Belle Plaine, MN)

Date

1964-67

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.1.186

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Person 97.4
Person 96.9
Person 96.6
Person 96.6
Person 96.5
Person 96.5
Person 96.3
Person 95.9
Person 95.5
Person 95.3
Person 95.1
Person 95.1
Person 94.8
Person 94.7
Baby 94.7
Person 94.6
Person 94.4
Person 94.4
Person 93.9
Person 93.6
Person 93.5
Person 93.1
Person 92.3
Person 92.1
Person 91.9
Person 91.6
Person 91.5
Baby 91.5
Person 90.9
Person 90
Person 89.6
Person 89.4
Person 89.2
Person 89.2
Person 88.2
Person 87.1
Person 87
Person 83.3
Person 82.4
Computer Hardware 81.5
Electronics 81.5
Hardware 81.5
Monitor 81.5
Screen 81.5
Person 79.1
Art 78
Collage 78
Face 77.9
Head 77.9
Person 77.7
Person 76
Person 74.9
Person 73.4
Person 73
Person 68.8
Person 66.6
Person 66
Person 63.6
Person 60.7
Photographic Film 58.5
Photo Booth 56.9
Indoors 55.7

Clarifai
created on 2019-02-18

people 99.9
many 99.6
group 99.6
shelf 99.6
no person 99.4
stock 98.1
adult 98.1
one 98
indoors 97.7
portrait 96.9
man 96.8
several 95.8
group together 93.8
vehicle 92.7
container 92.4
two 92
four 90.7
woman 90.2
room 89.7
commerce 89.7

Imagga
created on 2019-02-18

case 100
furniture 16.1
design 15.7
modern 14.7
equipment 14.2
technology 14.1
interior 13.2
kitchen 12.7
food 11.8
wood 11.6
business 11.5
close 11.4
cabinet 11.2
shop 10.8
shelf 10.2
power 10.1
industry 9.4
steel 8.8
computer 8.8
cooking 8.7
counter 8.7
architecture 8.6
restaurant 8.5
electric 8.4
stove 8.4
old 8.3
room 8.3
vintage 8.3
retro 8.2
structure 8
light 8
decor 7.9
objects 7.8
switch 7.8
control 7.6
bottle 7.5
house 7.5
machine 7.5
traditional 7.5
network 7.4
fruit 7.3
data 7.3
home 7.2

Google
created on 2019-02-18

Microsoft
created on 2019-02-18

Color analysis

Face analysis

Amazon

AWS Rekognition

Age 19-27
Gender Male, 98.2%
Calm 64.5%
Disgusted 22.3%
Surprised 7.5%
Fear 6.6%
Happy 3.4%
Sad 2.8%
Angry 2.6%
Confused 1.3%

AWS Rekognition

Age 25-35
Gender Male, 100%
Disgusted 60%
Calm 27.8%
Surprised 7.3%
Fear 6.1%
Happy 3.9%
Angry 2.7%
Sad 2.5%
Confused 2%

AWS Rekognition

Age 33-41
Gender Male, 100%
Calm 89.2%
Surprised 6.4%
Disgusted 6.3%
Fear 6%
Happy 2.9%
Sad 2.2%
Confused 0.4%
Angry 0.3%

AWS Rekognition

Age 19-27
Gender Male, 89.9%
Calm 76.5%
Surprised 6.9%
Disgusted 6.6%
Sad 6.2%
Fear 6.2%
Confused 4.3%
Happy 1.6%
Angry 1.2%

AWS Rekognition

Age 31-41
Gender Male, 99.5%
Surprised 57.1%
Confused 50.7%
Calm 9.8%
Fear 6.5%
Sad 2.3%
Disgusted 2%
Angry 0.4%
Happy 0.3%

AWS Rekognition

Age 19-27
Gender Male, 64.4%
Calm 88.8%
Surprised 6.4%
Fear 6%
Sad 3.8%
Confused 2.7%
Disgusted 2.4%
Happy 0.9%
Angry 0.4%

AWS Rekognition

Age 40-48
Gender Male, 98.8%
Calm 89.1%
Surprised 6.4%
Confused 6%
Fear 5.9%
Sad 2.3%
Disgusted 2.1%
Angry 1.1%
Happy 0.7%

AWS Rekognition

Age 23-31
Gender Female, 68%
Calm 95.7%
Surprised 6.5%
Fear 6.2%
Sad 2.5%
Disgusted 0.7%
Angry 0.6%
Confused 0.3%
Happy 0.2%

AWS Rekognition

Age 21-29
Gender Male, 98.9%
Calm 57.9%
Confused 12.3%
Disgusted 10.8%
Sad 8.7%
Surprised 7.2%
Fear 6.9%
Angry 2.8%
Happy 1.6%

AWS Rekognition

Age 21-29
Gender Male, 99.6%
Sad 99.9%
Calm 16.7%
Surprised 6.5%
Disgusted 6.2%
Fear 6.1%
Angry 1.3%
Happy 0.9%
Confused 0.7%

AWS Rekognition

Age 10-18
Gender Male, 91.1%
Calm 86.1%
Surprised 6.8%
Fear 6.3%
Sad 4.6%
Happy 2.7%
Angry 1.8%
Disgusted 0.9%
Confused 0.8%

AWS Rekognition

Age 21-29
Gender Female, 63.3%
Calm 74%
Fear 12.5%
Surprised 6.7%
Sad 4.1%
Angry 3.8%
Happy 2.8%
Disgusted 1%
Confused 0.5%

AWS Rekognition

Age 16-24
Gender Male, 87.9%
Calm 84.2%
Surprised 7.5%
Fear 6.2%
Sad 4.3%
Angry 2.7%
Happy 1.9%
Confused 1.6%
Disgusted 1.4%

AWS Rekognition

Age 11-19
Gender Male, 97.5%
Calm 60.7%
Sad 12.1%
Fear 8.7%
Surprised 8.1%
Happy 7.8%
Confused 4.7%
Disgusted 2.9%
Angry 1.5%

AWS Rekognition

Age 6-16
Gender Male, 70.9%
Calm 80.5%
Happy 8.2%
Surprised 6.7%
Fear 6.2%
Angry 4.4%
Sad 3.5%
Confused 1.2%
Disgusted 0.5%

AWS Rekognition

Age 23-31
Gender Male, 98.7%
Calm 58.7%
Sad 8.5%
Disgusted 8.5%
Happy 8%
Surprised 7.3%
Fear 7%
Confused 5.4%
Angry 4.6%

AWS Rekognition

Age 21-29
Gender Male, 95.5%
Calm 94.1%
Surprised 6.4%
Fear 6%
Sad 2.7%
Happy 1.6%
Angry 1.1%
Confused 0.8%
Disgusted 0.3%

AWS Rekognition

Age 34-42
Gender Male, 100%
Disgusted 77%
Calm 19.3%
Surprised 6.6%
Fear 6%
Sad 2.3%
Confused 1.2%
Angry 0.6%
Happy 0.3%

AWS Rekognition

Age 13-21
Gender Female, 87.5%
Calm 89.3%
Fear 6.9%
Surprised 6.5%
Sad 3.8%
Happy 1.3%
Angry 1.2%
Disgusted 0.7%
Confused 0.3%

AWS Rekognition

Age 20-28
Gender Male, 95.3%
Calm 42.4%
Happy 37.2%
Surprised 7.6%
Fear 7.3%
Angry 6.7%
Sad 3.7%
Confused 1.9%
Disgusted 1.8%

Feature analysis

Amazon

Person 97.4%
Baby 94.7%
Monitor 81.5%

Categories

Imagga

interior objects 52.1%
text visuals 43.3%
paintings art 3.4%

Captions

Microsoft
created on 2019-02-18

a glass display case 50.9%
a group of glass bottles 30.3%

Text analysis

Amazon

22A
20
28A
22
25A
>23A
6A
23
16A
10A
-29A
20A
-30A
28
->30
-10
13
>25
ЗА
-BA
-21A
EL
E7A
-22A
-12A
->20
VAC
->27
V6L
VOLE
on
ROGAL
DE
BI -
SI
26A
DA
24

Google

→3A →9 10A →12A →13 →13A 之20 →21A →22A →23 →23A → 24 326A → 27 226 → 28A → 29 29 →30
3A
9
10A
12A
13
13A
20
21A
22A
23
23A
24
326A
27
226
28A
29
30