Human Generated Data

Title

Untitled (doorman)

Date

1976

People

Artist: Sage Sohier, American, born 1954

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1157

Copyright

© Sage Sohier

Machine Generated Data

Tags (scores are confidence values, 0-100)

Amazon
created on 2022-01-09

Clothing 100
Apparel 100
Cloak 96.5
Fashion 96.5
Person 95.1
Human 95.1
Cape 77.1
Poncho 75.6
Person 72.7
Coat 68.1
Door 64.8
Person 50.6
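
Below is a minimal, hypothetical sketch of how label scores like the ones above can be produced with Amazon Rekognition's DetectLabels operation, assuming boto3 is installed, AWS credentials are configured, and the image filename is a placeholder.

import boto3  # AWS SDK for Python

rekognition = boto3.client("rekognition")

# Placeholder filename for the photograph.
with open("untitled_doorman.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only return labels scored at 50% or above
    )

# Print each label with its confidence, matching the "Name score" layout above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))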

Clarifai
created on 2023-10-25

umbrella 99.8
people 99.6
monochrome 99.3
rain 98.4
vintage 97.9
portrait 97
art 96.6
rainy 96.5
collage 96.3
one 96.1
girl 95.2
black and white 94.1
wear 94
woman 93.9
adult 92.8
doorway 92.6
door 92.6
desktop 92.2
model 91.4
fall 90.5

Imagga
created on 2022-01-09

sliding door 71.7
door 60.1
movable barrier 44.9
office 30.9
interior 30.1
barrier 29.7
chair 28.2
room 27.7
business 26.7
modern 25.2
furniture 24.5
computer 20.5
working 20.3
work 18
table 17.8
light 17.4
people 17.3
home 16.7
window 16.7
architecture 16.4
piano 16.2
indoors 15.8
television 15.8
laptop 15.8
urban 15.7
obstruction 15
desk 14.6
house 14.2
upright 13.6
man 13.4
glass 13.2
floor 13
percussion instrument 13
keyboard instrument 13
inside 12.9
seat 12.7
technology 12.6
building 12.5
city 12.5
stringed instrument 12.3
empty 12
businesswoman 11.8
transportation 11.6
design 11.2
men 11.2
women 11.1
professional 11
indoor 11
person 10.9
equipment 10.8
telephone 10.8
businessman 10.6
success 10.5
musical instrument 10.4
occupation 10.1
gate 10
wood 10
center 9.8
worker 9.8
corporate 9.4
sitting 9.4
communication 9.2
travel 9.2
silhouette 9.1
telecommunication system 9.1
suit 9
job 8.8
station 8.7
wall 8.5
manager 8.4
transport 8.2
call 8.2
furnishing 8
lobby 7.9
corridor 7.9
chairs 7.8
male 7.8
waiting 7.7
device 7.7
support 7.6
career 7.6
structure 7.5
alone 7.3
confident 7.3
kitchen 7.2
lifestyle 7.2
monitor 7.2
looking 7.2
information 7.1
decor 7.1
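
Tags like the Imagga list above can in principle be reproduced through Imagga's v2 tagging endpoint; the sketch below is a hypothetical example, with placeholder API credentials and image URL.

import requests

# Placeholder credentials and image URL for Imagga's v2 tagging endpoint.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/untitled_doorman.jpg"},
    auth=("<api_key>", "<api_secret>"),
)

# Each entry carries an English tag name and a 0-100 confidence score.
for tag in response.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))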

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

black and white 98.2
text 93.1
monochrome 89.1
clothing 88.7
street 79.3
person 71.4
white 66.3

Color Analysis

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 99.9%
Calm 46.6%
Surprised 31.2%
Confused 13.1%
Fear 5.4%
Sad 2%
Disgusted 0.7%
Angry 0.7%
Happy 0.3%
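
A hypothetical sketch of how age-range, gender, and emotion estimates like these can be requested from Amazon Rekognition's DetectFaces operation, assuming configured AWS credentials and a placeholder filename.

import boto3

rekognition = boto3.client("rekognition")

with open("untitled_doorman.jpg", "rb") as f:  # placeholder filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")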

Microsoft Cognitive Services

Age 34
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
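
A hypothetical sketch of how likelihood ratings like these can be obtained from the Google Cloud Vision face-detection feature, assuming the google-cloud-vision client library and application credentials are set up; the filename is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_doorman.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood values are returned as enum positions 0-5.
likelihood_name = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY")
for face in response.face_annotations:
    print("Surprise", likelihood_name[face.surprise_likelihood])
    print("Anger", likelihood_name[face.anger_likelihood])
    print("Sorrow", likelihood_name[face.sorrow_likelihood])
    print("Joy", likelihood_name[face.joy_likelihood])
    print("Headwear", likelihood_name[face.headwear_likelihood])
    print("Blurred", likelihood_name[face.blurred_likelihood])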

Feature analysis

Amazon

Person 95.1%
Door 64.8%

Captions

Microsoft
created on 2022-01-09

an old photo of a person 56.1%
an old photo of a person 52.4%
old photo of a person 45.4%
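
A hypothetical sketch of how ranked caption candidates like these can be requested from the Azure Computer Vision service, assuming the azure-cognitiveservices-vision-computervision package is installed; the endpoint, subscription key, and filename are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and subscription key.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("untitled_doorman.jpg", "rb") as f:  # placeholder filename
    description = client.describe_image_in_stream(f, max_candidates=3)

# Confidence is returned as a 0-1 fraction; scale it to match the percentages above.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")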