Human Generated Data

Title

Untitled (Fourteenth Street, New York City)

Date

1932-1934

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Machine Generated Data

Tags (confidence, %)

Amazon

Human 99.6
Person 99.6
Person 99.1
Person 97.7
Person 97.5
Person 96.6
Person 96.3
Person 96.2
Person 92.5
Accordion 88.6
Musical Instrument 88.6
Person 70.8
Overcoat 61.4
Coat 61.4
Clothing 61.4
Apparel 61.4
Suit 61.4
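
For context, label-and-confidence pairs like these are the shape of output returned by the Rekognition DetectLabels operation. A minimal Python sketch with boto3, assuming a local copy of the photograph and a default region (both placeholders, not details taken from this record):

```python
import boto3

# Sketch only: the file name and region are assumptions, not values from this record.
client = boto3.client("rekognition", region_name="us-east-1")

with open("fourteenth_street.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=60,
    )

# Print each label with its confidence, matching the "Label 99.6" layout above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```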

Clarifai

people 99.8
group 98.3
adult 98.1
many 97
group together 95.9
administration 94
music 92.4
man 92.2
several 90.7
leader 89.8
war 89.2
military 88.9
street 87.8
woman 87.7
instrument 86.7
musician 86.4
wear 85.5
outfit 83.5
two 82.7
vehicle 80.6
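
Concept scores of this kind resemble the output of Clarifai's general image-recognition model. A hedged sketch of a v2 REST request, in which the API key, model ID, and image URL are placeholders rather than values from this record:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed public general model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)

# Clarifai reports concept values in 0-1; scale to percent to match the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```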

Imagga

keyboard instrument 100
accordion 100
musical instrument 100
wind instrument 100
man 28.9
music 28
male 23.4
person 21.8
people 20.6
instrument 20.4
musical 20.1
adult 19.4
playing 18.2
musician 17.5
play 16.4
business 15.8
piano 14.7
hand 14.4
performance 14.3
portrait 14.2
sound 14
face 13.5
black 13.2
professional 12.6
entertainment 12
businessman 11.5
keyboard 11.3
men 11.2
old 11.1
holding 10.7
concert 10.7
artist 10.6
attractive 10.5
player 10.4
corporate 10.3
band 9.7
hair 9.5
culture 9.4
happy 9.4
one 8.9
building 8.7
weapon 8.7
rock 8.7
art 8.5
key 8.4
businesswoman 8.2
group 8.1
looking 8
smiling 7.9
chord 7.9
performing 7.9
work 7.8
melody 7.8
performer 7.8
education 7.8
war 7.7
classical 7.6
fashion 7.5
traditional 7.5
fun 7.5
vintage 7.4
classic 7.4
retro 7.4
guy 7.3
office 7.2
suit 7.2
job 7.1
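
Scores in this range match the shape of Imagga's /v2/tags response. A hedged sketch, with the credentials and image URL as placeholders:

```python
import requests

# Placeholder credentials; Imagga uses HTTP basic auth with an API key and secret.
auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)

# Each entry carries an English tag name and a 0-100 confidence score.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```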

Microsoft

person 100
outdoor 97.5
accordion 97.1
music 90.6
people 79.5
standing 76.6
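
Tags of this form can be requested from the Azure Computer Vision Analyze operation. A hedged sketch against the v3.2 REST endpoint, with the endpoint, key, and image URL as placeholders:

```python
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_CV_KEY"                                          # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},
)

# Confidences are returned in 0-1; scale to percent to match the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```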

Face analysis

AWS Rekognition

Age 23-38
Gender Male, 54.7%
Surprised 45.2%
Disgusted 45.1%
Calm 54%
Confused 45.2%
Angry 45.2%
Happy 45.2%
Sad 45.2%

AWS Rekognition

Age 35-52
Gender Male, 51.3%
Calm 48.9%
Confused 45.6%
Surprised 45.7%
Disgusted 46.4%
Happy 45.4%
Angry 47.3%
Sad 45.7%

AWS Rekognition

Age 48-68
Gender Male, 78.7%
Happy 0.5%
Confused 0.3%
Sad 2.6%
Angry 0.5%
Calm 95.6%
Disgusted 0.2%
Surprised 0.3%

AWS Rekognition

Age 57-77
Gender Male, 97.1%
Surprised 1.9%
Happy 1.1%
Disgusted 18.3%
Calm 9.5%
Sad 24.8%
Angry 35.1%
Confused 9.3%

AWS Rekognition

Age 48-68
Gender Male, 52.2%
Sad 45.1%
Calm 54.6%
Confused 45.1%
Angry 45%
Happy 45.1%
Disgusted 45.1%
Surprised 45.1%

AWS Rekognition

Age 23-38
Gender Male, 54.4%
Confused 45.2%
Angry 45.4%
Surprised 45.2%
Calm 45.4%
Happy 45.1%
Disgusted 45.2%
Sad 53.5%

AWS Rekognition

Age 35-52
Gender Male, 50.1%
Disgusted 45.9%
Happy 45.4%
Surprised 45.6%
Sad 46%
Calm 50.3%
Angry 46.2%
Confused 45.6%

AWS Rekognition

Age 35-52
Gender Male, 51.5%
Sad 45.5%
Disgusted 45.6%
Surprised 45.2%
Calm 52.8%
Angry 45.4%
Happy 45.2%
Confused 45.2%

AWS Rekognition

Age 26-43
Gender Female, 53.2%
Confused 45.2%
Sad 45.3%
Calm 47.2%
Disgusted 50.2%
Happy 45.5%
Surprised 45.7%
Angry 45.9%

AWS Rekognition

Age 14-25
Gender Male, 54.7%
Sad 45.1%
Calm 54.2%
Surprised 45.1%
Angry 45.1%
Disgusted 45.2%
Happy 45.2%
Confused 45.1%

AWS Rekognition

Age 20-38
Gender Male, 53.7%
Angry 46.1%
Disgusted 47.6%
Happy 45.1%
Sad 46.4%
Calm 49.2%
Confused 45.4%
Surprised 45.1%

AWS Rekognition

Age 27-44
Gender Male, 53.8%
Happy 45.1%
Disgusted 48.2%
Calm 47%
Surprised 45.4%
Sad 46.7%
Confused 45.6%
Angry 47%

AWS Rekognition

Age 38-59
Gender Male, 52.5%
Sad 46.2%
Calm 53%
Angry 45.5%
Surprised 45.1%
Confused 45.1%
Happy 45.1%
Disgusted 45%

AWS Rekognition

Age 35-52
Gender Female, 50.4%
Disgusted 49.5%
Angry 49.5%
Calm 49.9%
Sad 50%
Happy 49.5%
Confused 49.5%
Surprised 49.5%
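
The age ranges, gender estimates, and per-emotion confidences above have the shape of the Rekognition DetectFaces response. A minimal boto3 sketch, with the file name and region as assumptions:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("fourteenth_street.jpg", "rb") as f:                  # placeholder file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # One confidence per emotion category, as listed above.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```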

Microsoft Cognitive Services

Age 56
Gender Male
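
An age and gender estimate like this matches the Azure Face detect operation with those attributes requested. A hedged sketch of that REST call (v1.0), with the endpoint, key, and image URL as placeholders:

```python
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_FACE_KEY"                                        # placeholder

resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},
)

# One entry per detected face, each with estimated age and gender.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')
```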

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
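
Likelihood ratings of this kind come from Google Vision face detection. A minimal sketch with the google-cloud-vision client, assuming a local copy of the image:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("fourteenth_street.jpg", "rb") as f:   # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each attribute is reported as a likelihood enum (VERY_UNLIKELY .. VERY_LIKELY).
for face in response.face_annotations:
    for attr in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{attr}_likelihood")
        print(attr.capitalize(), vision.Likelihood(value).name)
```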

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people holding a sign 81.8%
a group of people standing next to a man holding a sign 75.4%
a group of people standing in the street 75.3%
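
Ranked captions like these are returned by the Computer Vision Describe operation. A hedged sketch against the v3.2 REST endpoint, with the endpoint, key, and image URL as placeholders:

```python
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_CV_KEY"                                          # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.com/photo.jpg"},
)

# Each candidate caption carries a 0-1 confidence; scale to percent as above.
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```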

Text analysis

Amazon

DRUGS
Ligoettes
CANDl
4NDY
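
Raw text detections of this sort have the shape of the Rekognition DetectText response. A minimal boto3 sketch, with the file name and region as assumptions:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

with open("fourteenth_street.jpg", "rb") as f:                  # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries are whole detected lines; WORD entries repeat them word by word.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```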

Google

DRUGS CAND ANDY
DRUGS
CAND
ANDY
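
The full-block line followed by individual words matches Google Vision text detection. A minimal sketch with the google-cloud-vision client, assuming a local copy of the image:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("fourteenth_street.jpg", "rb") as f:   # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected block; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)
```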