Human Generated Data

Title

Untitled (office of Dr. Herman M. Juergens, Belle Plaine, MN)

Date

1964-67

People

Artist: Gordon W. Gahan, American (1945-1984)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.1.191

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Person 97.8
Person 97.8
Person 97.2
Electronics 97.2
Screen 97.2
Person 97.1
Person 96.9
Person 96.7
Person 96.5
Person 96.2
Person 96.2
Person 96
Person 95.5
Person 95.4
Person 95.3
Person 95.1
Person 94.6
Person 94.5
Person 91.5
Computer Hardware 90.5
Hardware 90.5
Person 86.1
Baby 86.1
Monitor 84.1
Person 83.9
Person 83
Person 82.9
Person 76.8
Adult 76.8
Bride 76.8
Female 76.8
Wedding 76.8
Woman 76.8
Face 74.1
Head 74.1
Person 63.7
Photographic Film 58.1
Machine 58
Photo Booth 56.4
Art 55.6
Collage 55.6
Vending Machine 55.6
TV 55.2
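
The label list in the Amazon block above has the shape of output returned by the AWS Rekognition DetectLabels API. The following is only a minimal sketch of how such tags could be generated with boto3; the bucket name, object key, and the 55% confidence floor are placeholder assumptions, not details taken from this record.

# Minimal sketch (assumptions noted above): print label tags in the
# "Name Confidence" format used in this record via AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "2007.184.1.191.jpg"}},
    MinConfidence=55,  # assumption: roughly the lowest score listed above
)

for label in response["Labels"]:
    # e.g. "Person 97.8"
    print(f"{label['Name']} {label['Confidence']:.1f}")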

Clarifai
created on 2019-02-18

television 99.7
people 98.7
group 97.3
screen 96.9
movie 96.1
many 94.7
vehicle 94.3
moment 94.2
analogue 92
one 91.6
adult 91.2
three 91.1
sliding 89.9
picture frame 89.6
display 89.6
no person 88.8
man 88.4
album 88.2
portrait 88
technology 86.8

Imagga
created on 2019-02-18

monitor 43.1
case 41.3
equipment 36.5
electronic equipment 35.9
television 22.2
technology 20
screen 19.2
furniture 18.9
design 17.4
digital 16.2
display 16.2
computer 15.3
film 14.8
modern 14.7
business 14
entertainment 12.9
device 12.6
buffet 12.5
interior 12.4
home 11.9
furnishing 11.9
old 11.8
architecture 11.7
network 11.2
industry 11.1
grunge 11.1
black 10.8
office 10.2
room 10
movie 9.7
machine 9.7
video 9.7
money 9.3
web 9.3
communication 9.2
house 9.2
investment 9.2
close 9.1
bank 8.9
pattern 8.9
paper 8.6
vending machine 8.6
empty 8.6
media 8.5
glass 8.5
power 8.4
broadcasting 8.2
indoor 8.2
center 8.2
noise 7.8
art 7.8
radio 7.7
texture 7.6
control 7.6
finance 7.6
electronics 7.6
electronic 7.5
camera 7.4
set 7.4
retro 7.4
inside 7.4
banking 7.3
negative 7.3
music 7.2
amplifier 7.2
wealth 7.2
science 7.1
night 7.1
steel 7.1
indoors 7
slot machine 7

Google
created on 2019-02-18

Microsoft
created on 2019-02-18

old 87
watching 43.7
exhibition 43.7
art 17.9
museum 9.3
black and white 8.4
television 4.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 24-34
Gender Male, 98.8%
Angry 94.6%
Surprised 6.8%
Fear 6.4%
Sad 2.4%
Calm 1.7%
Confused 0.2%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 37-45
Gender Male, 99.9%
Happy 97.1%
Surprised 6.3%
Fear 6.1%
Sad 2.3%
Disgusted 0.8%
Calm 0.4%
Confused 0.3%
Angry 0.2%

AWS Rekognition

Age 34-42
Gender Male, 98.3%
Calm 44.5%
Happy 39.2%
Sad 7.5%
Surprised 6.5%
Fear 6%
Confused 4.5%
Angry 1%
Disgusted 0.7%

AWS Rekognition

Age 60-70
Gender Male, 99.7%
Sad 94.5%
Calm 37%
Surprised 6.9%
Fear 6.8%
Happy 5%
Disgusted 3.6%
Angry 3.2%
Confused 1%

AWS Rekognition

Age 21-29
Gender Female, 56.1%
Angry 94.1%
Surprised 6.7%
Fear 6.2%
Sad 2.6%
Disgusted 1%
Happy 0.9%
Calm 0.6%
Confused 0.3%

AWS Rekognition

Age 28-38
Gender Male, 99.7%
Surprised 46.2%
Calm 25.7%
Fear 16.7%
Disgusted 13.2%
Happy 9.6%
Sad 2.6%
Confused 1.8%
Angry 1.2%

AWS Rekognition

Age 29-39
Gender Male, 90%
Calm 87.6%
Surprised 6.5%
Fear 6%
Sad 3.6%
Confused 2.8%
Happy 2.6%
Disgusted 1.6%
Angry 0.9%

AWS Rekognition

Age 37-45
Gender Male, 98.6%
Calm 52.8%
Confused 26.4%
Surprised 8%
Angry 6.2%
Fear 6.2%
Sad 4.6%
Disgusted 4.3%
Happy 0.9%

AWS Rekognition

Age 22-30
Gender Male, 99.3%
Calm 63.2%
Sad 32.6%
Surprised 6.9%
Fear 6.4%
Happy 5.3%
Angry 3.7%
Disgusted 1.8%
Confused 1.6%

AWS Rekognition

Age 54-62
Gender Male, 99.9%
Calm 98.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.7%
Disgusted 0.3%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99.7%
Calm 93.5%
Surprised 7.5%
Fear 6.5%
Sad 2.3%
Disgusted 1.2%
Angry 0.4%
Confused 0.3%
Happy 0.2%
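
Each AWS Rekognition face block above (an age range, a gender call, and per-emotion confidences) matches the attribute output of the DetectFaces API. A hedged sketch with boto3 follows; the image location is a placeholder, and requesting all attributes is an assumption about how this record was produced.

# Sketch only: print age range, gender and emotion confidences in the
# same layout as the face-analysis blocks above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "2007.184.1.191.jpg"}},
    Attributes=["ALL"],  # ask for age, gender and emotion estimates
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")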

Feature analysis

Amazon

Person 97.8%
Baby 86.1%
Monitor 84.1%
Adult 76.8%
Bride 76.8%
Female 76.8%
Woman 76.8%

Categories

Imagga

text visuals 100%

Text analysis

Amazon

BA
12A
-15
-13
-10A
>15A
->4
77
١٤٨-
->313
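
The strings in the Amazon list above read like film-edge frame markings picked up by OCR; output of this shape can be produced with the AWS Rekognition DetectText API. A minimal sketch follows, again with a placeholder image reference.

# Sketch: print word-level text detections from AWS Rekognition DetectText.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "2007.184.1.191.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":  # the lists here are word-level strings
        print(detection["DetectedText"])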

Google

915
313
13A
10A
6A
10
4
915 →15A 313 13A 14 →12A 10A 6A 10 →4
15A
14
12A