Human Generated Data

Title

Untitled (man and woman at customs counter, Miami International Airport)

Date

1951, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99.3
Human 99.3
Person 99.2
Person 99.1
Person 98.2
Apparel 92.7
Clothing 92.7
Person 91.4
Sitting 75.9
Shirt 75.4
Tie 68.2
Accessories 68.2
Accessory 68.2
Tie 65.5
Furniture 64.8
Overcoat 62.6
Coat 62.6
Sleeve 55.8
Crowd 55.7

Clarifai

people 99.8
adult 96.9
group together 96.4
group 95.8
man 94.4
typewriter 91.4
war 91
two 90.7
music 89
desk 89
five 87.7
several 87.4
woman 86.3
furniture 85.9
leader 84.9
actor 84.7
wear 84.3
vehicle 84
three 83.2
administration 83

Imagga

person 44.2
office 41.1
man 37
business 35.2
male 34
computer 32.6
scholar 32.2
people 30.7
laptop 30.5
professional 29.6
businessman 29.1
working 27.4
work 25.9
corporate 25.8
intellectual 25.7
adult 25.6
musical instrument 22.4
desk 20.5
executive 20.4
businesspeople 19.9
table 19.5
happy 19.4
sitting 18
keyboard 17.8
job 17.7
worker 17.2
suit 17
businesswoman 16.4
looking 16
percussion instrument 15.4
smiling 15.2
smile 15
indoors 14.9
manager 14.9
black 13.2
portrait 12.9
document 12.6
technology 12.6
paper 12.5
team 12.5
workplace 12.4
hands 12.2
group 12.1
men 12
successful 11.9
notebook 11.9
room 11.6
handsome 11.6
device 11.4
career 11.4
meeting 11.3
success 11.3
lifestyle 10.8
face 10.6
one 10.4
phone 10.1
indoor 10
studio 9.9
hand 9.9
attractive 9.8
boss 9.6
serious 9.5
education 9.5
classroom 9.4
glasses 9.2
employee 9.1
holding 9.1
student 9
jacket 8.8
colleagues 8.7
corporation 8.7
pen 8.7
happiness 8.6
thinking 8.5
tie 8.5
telephone 8.5
casual 8.5
modern 8.4
communication 8.4
cheerful 8.1
school 8.1
home 8
mid adult 7.7
teacher 7.7
disk jockey 7.6
book 7.5
fun 7.5
teamwork 7.4
alone 7.3
free-reed instrument 7.1

Microsoft

person 99.3
clothing 97.5
text 95.7
man 93.3
indoor 91.4
people 69
posing 56.3
old 54.5
black and white 51.8

Face analysis

AWS Rekognition

Age 29-45
Gender Male, 98.6%
Angry 3%
Disgusted 0.8%
Surprised 1%
Fear 1.1%
Confused 2.6%
Happy 1.4%
Sad 34.3%
Calm 55.8%

AWS Rekognition

Age 34-50
Gender Male, 52.3%
Happy 49.4%
Surprised 45.1%
Fear 45.2%
Calm 46.4%
Angry 45.5%
Sad 48%
Confused 45.2%
Disgusted 45.2%

AWS Rekognition

Age 13-25
Gender Female, 51.4%
Fear 45%
Confused 45%
Happy 45%
Angry 45%
Disgusted 45%
Calm 54.8%
Surprised 45.1%
Sad 45.1%

AWS Rekognition

Age 23-35
Gender Female, 50.5%
Fear 45.2%
Angry 47.1%
Happy 45.1%
Disgusted 45.1%
Confused 45.3%
Calm 50.7%
Sad 46.1%
Surprised 45.3%

Microsoft Cognitive Services

Age 37
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Tie 68.2%

Captions

Microsoft

a group of people standing around a table 96%
a group of people posing for a photo on a table 94.2%
a group of people posing for a photo 93.4%

Text analysis

Google

FRACILE
FRACILE