Human Generated Data

Title

Untitled (Dr. Herman M. Juergens talking with patient)

Date

1965-1968

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Machine Generated Data

Tags

Amazon

Human 99.2
Person 99.2
Person 98.6
Apparel 92.2
Clothing 92.2
Pants 79.1
Leisure Activities 75.4
Wood 75.2
Flooring 70.7
Furniture 69.1
Floor 68.3
Food 67.2
Meal 67.2
Sitting 65.8
Finger 64.5
Couch 62.1
Dance Pose 59.8
Chair 58.3
Table 58.1

Clarifai

people 99.9
adult 99.2
one 98
room 97.4
furniture 96.9
man 96.9
two 95.7
woman 94.7
wear 92
group 91.3
indoors 90.9
monochrome 90.3
group together 88.1
concentration 87.3
sit 86.3
military 85.3
war 84.3
scientist 82.4
employee 82.3
leader 82.2

Imagga

man 39.6
computer 35.6
office 32.4
person 31.3
work 30.6
people 29.6
laptop 29.5
working 29.1
business 29.1
sax 28.2
male 26.9
desk 24
job 23
worker 22.8
table 21.8
adult 21.2
room 20.8
businessman 20.3
indoors 19.3
professional 19.1
sitting 18.9
corporate 17.2
home 15.9
occupation 15.6
executive 15.4
smiling 15.2
technology 14.8
meeting 14.1
chair 13.7
restaurant 13.6
communication 13.4
shop 13.3
happy 13.2
lifestyle 13
hand 12.9
men 12.9
smile 12.8
modern 12.6
barbershop 12.5
interior 12.4
group 12.1
looking 12
wind instrument 10.5
phone 10.1
notebook 10
teacher 9.9
monitor 9.9
holding 9.9
equipment 9.8
employee 9.6
education 9.5
student 9.5
paper 9.4
keyboard 9.4
center 9.1
confident 9.1
businesswoman 9.1
handsome 8.9
success 8.8
engineer 8.8
conference 8.8
workplace 8.6
businesspeople 8.5
call 8.5
study 8.4
manager 8.4
teamwork 8.3
building 8.3
inside 8.3
board 8.2
alone 8.2
indoor 8.2
suit 8.1
team 8.1
telephone 7.9
hands 7.8
device 7.8
portrait 7.8
industry 7.7
network 7.6
reading 7.6
career 7.6
sit 7.6
mercantile establishment 7.6
house 7.5
senior 7.5
one 7.5
mature 7.4
light 7.3
hall 7.2
furniture 7.2
musical instrument 7.1
look 7

Google

Microsoft

person 95
indoor 94.4
black and white 93.7
man 92.7
piano 88.6
text 81
clothing 58.7
table 29.4

Face analysis

Amazon

AWS Rekognition

Age 26-40
Gender Male, 51.8%
Confused 45%
Surprised 45%
Disgusted 45%
Fear 45%
Angry 45%
Calm 51.8%
Happy 45.1%
Sad 47.9%

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a man sitting at a table 87.4%
a man sitting at a table in a kitchen 82.1%
a man sitting on a kitchen counter 68.7%