Human Generated Data

Title

Untitled (three men at restaurant table, waitress taking order)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Machine Generated Data

Tags

Amazon

Furniture 99.5
Chair 99.5
Person 98.9
Human 98.9
Person 94.1
Clothing 90
Apparel 90
Sitting 83.4
Musician 73.5
Musical Instrument 73.5
Electronics 65.9
Screen 65.9
Monitor 64
Display 64
Food 63.2
Meal 63.2
Table 61.6
Laptop 60.8
Computer 60.8
Pc 60.8
Leisure Activities 60.3
Text 58.9
Face 58.3
Urban 57.9
Wood 57.4
Studio 57.2
LCD Screen 56.2
Finger 55.7
Restaurant 55.3
Person 45.1
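
Tag lists like the Amazon one above are the kind of output returned by AWS Rekognition's DetectLabels API, which reports each label with a confidence score. As a minimal sketch, the response shape below follows the documented `Labels` structure, but the sample values are illustrative only, not the actual API output for this photograph:

```python
# Sketch: render a Rekognition-style DetectLabels response as
# "Name score" lines, highest confidence first, as in the list above.

def format_labels(response, min_confidence=55.0):
    """Render Rekognition labels as 'Name score' lines, best first."""
    lines = []
    for label in response.get("Labels", []):
        score = label["Confidence"]
        if score >= min_confidence:
            lines.append(f"{label['Name']} {round(score, 1)}")
    # Sort by the numeric score at the end of each line, descending.
    return sorted(lines, key=lambda s: float(s.rsplit(" ", 1)[1]), reverse=True)

# Illustrative response fragment in the documented DetectLabels shape.
sample = {
    "Labels": [
        {"Name": "Chair", "Confidence": 99.52},
        {"Name": "Person", "Confidence": 98.91},
        {"Name": "Restaurant", "Confidence": 55.3},
    ]
}

print("\n".join(format_labels(sample)))
# → Chair 99.5
#   Person 98.9
#   Restaurant 55.3
```

The `min_confidence` threshold mirrors the API's `MinConfidence` request parameter; lower-scoring labels (such as the 45.1 entry above) would only appear if that threshold were relaxed.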

Clarifai

people 99.5
adult 96.9
man 96.7
group 96.3
indoors 95.1
woman 91.8
monochrome 89.5
music 89.4
sit 89.2
group together 88.7
room 87.1
actor 85.9
furniture 84.4
sitting 78.9
wear 77.9
musician 77.6
education 76.5
chair 72.8
concentration 69.4
leader 68.9

Imagga

man 37
people 32.4
business 29.8
male 29.1
person 28.8
musical instrument 26.2
businessman 24.7
office 24.4
computer 22.5
men 22.3
adult 21.2
work 20.4
laptop 20.3
corporate 19.8
room 18.8
job 17.7
wind instrument 17
sitting 16.3
working 15.9
indoors 15.8
group 15.3
businesspeople 15.2
modern 14.7
professional 14.6
chair 14.3
executive 14.1
mature 13.9
lifestyle 13.7
indoor 13.7
worker 13.4
equipment 13
businesswoman 12.7
employee 12.6
barbershop 12.4
smiling 12.3
looking 12
accordion 11.8
shop 11.7
handsome 11.6
classroom 11.5
black 11.4
desk 11.3
meeting 11.3
education 11.3
technology 11.1
casual 11
keyboard instrument 10.8
device 10.5
occupation 10.1
confident 10
photographer 9.9
hand 9.9
table 9.5
women 9.5
alone 9.1
suit 9.1
holding 9.1
conference 8.8
building 8.8
happy 8.8
teacher 8.6
face 8.5
communication 8.4
teamwork 8.3
music 8.2
team 8.1
hall 7.9
mercantile establishment 7.8
stringed instrument 7.8
corporation 7.7
stage 7.7
monitor 7.7
life 7.6
break 7.6
career 7.6
senior 7.5
study 7.5
manager 7.5
success 7.2
board 7.2
home 7.2
to 7.1
interior 7.1
happiness 7.1
together 7

Microsoft

text 96.7
person 89.8
clothing 86.6
black and white 76.4
man 65.6

Face analysis

Amazon

AWS Rekognition

Age 32-48
Gender Male, 52.9%
Angry 49.2%
Confused 46.1%
Happy 45.1%
Calm 46.5%
Surprised 45.5%
Sad 46.6%
Disgusted 45.4%
Fear 45.7%

AWS Rekognition

Age 43-61
Gender Male, 90.1%
Surprised 1.4%
Confused 4.4%
Fear 3.9%
Disgusted 9%
Sad 33.3%
Happy 0.4%
Calm 42.2%
Angry 5.5%

AWS Rekognition

Age 39-57
Gender Female, 66.4%
Angry 10.2%
Surprised 7.8%
Sad 10.6%
Happy 4.7%
Fear 54%
Confused 1.4%
Calm 10.2%
Disgusted 1.2%
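
The per-face entries above follow the shape of AWS Rekognition's DetectFaces response, where each detected face is a `FaceDetail` with an age range, a gender estimate, and a list of emotion scores. A hedged sketch of how such a record can be rendered into the lines shown (the sample `FaceDetail` is illustrative, not the real analysis of this photograph):

```python
# Sketch: render one Rekognition-style FaceDetail as the age, gender,
# and emotion lines used in the face-analysis listing above.

def format_face(detail):
    """Render one FaceDetail as 'Age', 'Gender', and emotion lines."""
    lines = [
        f"Age {detail['AgeRange']['Low']}-{detail['AgeRange']['High']}",
        f"Gender {detail['Gender']['Value']}, {round(detail['Gender']['Confidence'], 1)}%",
    ]
    for emotion in detail["Emotions"]:
        # Rekognition emotion types are upper-case (e.g. "ANGRY").
        lines.append(f"{emotion['Type'].capitalize()} {round(emotion['Confidence'], 1)}%")
    return lines

# Illustrative FaceDetail in the documented DetectFaces shape.
sample_detail = {
    "AgeRange": {"Low": 32, "High": 48},
    "Gender": {"Value": "Male", "Confidence": 52.9},
    "Emotions": [
        {"Type": "ANGRY", "Confidence": 49.2},
        {"Type": "HAPPY", "Confidence": 45.1},
    ],
}

print("\n".join(format_face(sample_detail)))
# → Age 32-48
#   Gender Male, 52.9%
#   Angry 49.2%
#   Happy 45.1%
```

Note that the emotion scores are independent confidences rather than a probability distribution, which is why the eight emotion values in each block above need not sum to 100%.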

Feature analysis

Amazon

Chair 99.5%
Person 98.9%
Laptop 60.8%

Captions

Microsoft

a group of people standing next to a window 75.7%
a group of people standing in front of a window 74.3%
a person standing next to a window 67.3%

Text analysis

Amazon

VAQOX-YI33A2
-