Human Generated Data

Title

Untitled (two photographs: old man posed reading in den next to bay windows with plants on ledge; double studio portrait of woman wearing dress, hat, corsage, and cross)

Date

1935-1940, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Machine Generated Data

Tags (label confidence scores, %)

Amazon

Human 99.6
Person 99.6
Person 99.5
Interior Design 99.3
Indoors 99.3
Room 86.3
Person 77.2
Leisure Activities 76
Crowd 68.9
Chair 59.3
Furniture 59.3
Theater 55.5
Court 55.4
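
The label/score pairs above are the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of how such tags can be reproduced with boto3, assuming configured AWS credentials; the file path and thresholds are illustrative, not part of the catalog record:

```python
import boto3

client = boto3.client("rekognition")

# Read the scanned photograph as raw bytes ("photo.jpg" is a placeholder).
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,       # cap the number of returned labels
        MinConfidence=50,   # drop low-confidence labels
    )

for label in response["Labels"]:
    # Prints e.g. "Person 99.6", matching the format of the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```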

Clarifai

people 99.8
monochrome 98.6
man 98.4
woman 96.6
adult 96.3
two 96.1
group 96
street 91.8
group together 91.7
furniture 90.4
three 88.7
four 87.4
music 86.8
room 86.5
chair 85.8
sit 84.6
family 83.7
wear 83.6
indoors 83.4
portrait 82.5
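
The concept/score pairs above match the output of Clarifai's general image-recognition model. A hedged sketch against the v2 REST API; the API key, model ID, and image URL are placeholder assumptions, and Clarifai's current client libraries may differ:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder
MODEL_ID = "general-image-recognition"   # assumed general-model ID

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1 floats; scale to match the percentages above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```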

Imagga

room 39.1
interior 30.9
man 26.8
classroom 24.7
chair 20.7
indoors 19.3
restaurant 19.3
people 18.4
modern 18.2
male 17.7
business 17.6
table 16.6
person 16.4
inside 15.6
office 15.5
indoor 14.6
light 12
men 12
building 11.9
furniture 11.9
work 11.8
shop 11.2
home 11.2
worker 10.9
city 10.8
window 10.6
urban 10.5
architecture 10.3
hospital 10.2
design 10.1
kitchen 10
nurse 9.8
businessman 9.7
black 9.6
standing 9.5
women 9.5
barroom 9.2
hall 9.2
holding 9.1
group 8.9
computer 8.8
equipment 8.8
chairs 8.8
barbershop 8.7
decoration 8.7
lifestyle 8.7
sitting 8.6
glass 8.5
wall 8.5
industry 8.5
style 8.1
school 8
decor 7.9
steel 7.9
working 7.9
life 7.9
desk 7.8
old 7.7
dinner 7.6
happy 7.5
floor 7.4
teamwork 7.4
occupation 7.3
food 7.3
adult 7.3
employee 7.1
family 7.1
patient 7.1
job 7.1
structure 7.1
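
Imagga's tagging endpoint returns the same tag/confidence structure seen above. A sketch of the v2 REST call, with placeholder credentials and image URL:

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP Basic auth
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    # Imagga reports confidence on a 0-100 scale, with tags keyed by language.
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```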

Microsoft

text 96.9
clothing 96
man 92.7
black and white 92.6
indoor 90.1
person 89.3
human face 54.3
old 45.7
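
The Microsoft tags above correspond to the Tags feature of the Azure Computer Vision "analyze" endpoint. A sketch assuming the v3.2 REST API; the resource endpoint, key, and image URL are placeholders:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is a 0-1 float; scale to match the list above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```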

Face analysis (one block per detected face)

AWS Rekognition

Age 38-56
Gender Female, 54.7%
Disgusted 45.1%
Happy 45.1%
Angry 45.3%
Confused 45.2%
Calm 45.9%
Fear 45.3%
Sad 53%
Surprised 45%

AWS Rekognition

Age 22-34
Gender Female, 54.9%
Calm 54.1%
Angry 45.1%
Surprised 45%
Confused 45.1%
Disgusted 45.1%
Happy 45.3%
Sad 45.2%
Fear 45%

AWS Rekognition

Age 51-69
Gender Male, 50.4%
Angry 49.5%
Sad 49.9%
Happy 49.5%
Disgusted 49.5%
Calm 50%
Confused 49.6%
Surprised 49.5%
Fear 49.5%
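
The three blocks above are per-face results from Amazon Rekognition's DetectFaces operation. A minimal boto3 sketch that prints the same age-range, gender, and emotion fields; the file path is a placeholder:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase, e.g. "SAD" -> "Sad".
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```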

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 37
Gender Female
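
The two age/gender estimates above are the classic output of the Azure Face API's detect call. A sketch of the v1.0 REST request that returned these attributes; note that Microsoft has since restricted facial age/gender estimation, and the endpoint, key, and image URL below are placeholders:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

for face in resp.json():  # one entry per detected face
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")
```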

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
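
Google Vision reports face attributes as likelihood ratings rather than percentages, which is why the two blocks above use the "Very unlikely" / "Very likely" scale. A sketch using the google-cloud-vision Python client; credentials are assumed configured and the file path is a placeholder:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood is an enum: VERY_UNLIKELY ... VERY_LIKELY.
    print(f"Surprise {face.surprise_likelihood.name}")
    print(f"Anger {face.anger_likelihood.name}")
    print(f"Sorrow {face.sorrow_likelihood.name}")
    print(f"Joy {face.joy_likelihood.name}")
    print(f"Headwear {face.headwear_likelihood.name}")
    print(f"Blurred {face.blurred_likelihood.name}")
```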

Feature analysis

Amazon

Person 99.6%
Chair 59.3%

Captions

Microsoft

a black and white photo of a man 91.3%
an old photo of a man 91.2%
black and white photo of a man 88.1%
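
The ranked captions above come from the Description feature of Azure Computer Vision. A sketch of the v3.2 "describe" call, which returns several candidate captions with confidences; endpoint, key, and image URL are placeholders:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},  # ask for several candidate captions
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")
```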

Text analysis

Amazon

CVOE
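
The fragment above ("CVOE") is the raw text Amazon's OCR detected in the image, consistent with Rekognition's DetectText operation. A minimal boto3 sketch; the file path is a placeholder:

```python
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder path
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(detection["DetectedText"])
```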