Human Generated Data

Title

Untitled (five women wearing aprons posed in kitchen next to table full of plates with food)

Date

1948

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Machine Generated Data

Tags

Amazon

Human 99.8
Person 99.8
Person 99.7
Person 99.4
Person 99.2
Person 98.7
Home Decor 95.4
Clothing 90.6
Apparel 90.6
Furniture 87
Meal 71
Food 71
Female 70.3
Dish 69
Linen 67.9
Face 67.2
Shelf 67
People 62
Sitting 60
Senior Citizen 59.4
Couch 58
Text 57.2
Indoors 55.8
Table 55.6
Screen 55.6
Electronics 55.6
Woman 55.2
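The Amazon tags above pair each label with a confidence score, which matches the shape of an AWS Rekognition `DetectLabels` response. A minimal sketch of filtering such a response by confidence, assuming the labels arrive in Rekognition's documented `{"Labels": [{"Name": ..., "Confidence": ...}]}` form (the `labels_above` helper and the sample fragment are illustrative, not part of the catalog record):

```python
def labels_above(response, threshold):
    """Filter a DetectLabels response to (name, confidence) pairs at or above threshold."""
    return [(label["Name"], label["Confidence"])
            for label in response["Labels"]
            if label["Confidence"] >= threshold]

def detect_labels(bucket, key):
    """Call Rekognition DetectLabels on an image stored in S3 (requires AWS credentials)."""
    import boto3  # AWS SDK; imported here so the parsing helper works without it
    client = boto3.client("rekognition")
    return client.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MinConfidence=50,
    )

# Response fragment shaped like the tags listed above:
sample = {"Labels": [
    {"Name": "Human", "Confidence": 99.8},
    {"Name": "Person", "Confidence": 99.8},
    {"Name": "Woman", "Confidence": 55.2},
]}
print(labels_above(sample, 90))  # → [('Human', 99.8), ('Person', 99.8)]
```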

Clarifai

people 99.8
group 98.7
group together 97.4
adult 96.8
man 96.8
woman 95.8
several 90.4
administration 89
three 87.1
music 86.9
monochrome 86
furniture 84.6
room 83.5
indoors 83.2
two 82.7
many 81.8
four 81.7
musician 81.3
five 80.1
wear 78

Imagga

barbershop 55.9
shop 50.7
mercantile establishment 36.1
room 26.7
home 26.3
people 26.2
interior 25.6
person 24.7
place of business 24.5
indoors 23.7
chair 22.5
man 22.2
male 19.8
men 18.9
kitchen 18
table 17.3
adult 17.3
modern 16.8
women 16.6
business 16.4
house 15.9
salon 15.9
teacher 15.6
counter 15.4
restaurant 14.8
inside 14.7
indoor 14.6
office 14.5
happy 14.4
smiling 13.7
lifestyle 13.7
businessman 12.3
establishment 12.2
floor 12.1
classroom 11.5
cheerful 11.4
furniture 11.3
couple 11.3
sitting 11.2
horizontal 10.9
holding 10.7
food 10.3
design 10.1
fun 9.7
group 9.7
computer 9.6
light 9.3
casual 9.3
dinner 9.3
drink 9.2
back 9.2
standing 8.7
happiness 8.6
smile 8.5
togetherness 8.5
mature 8.4
life 8.3
leisure 8.3
fashion 8.3
occupation 8.2
human 8.2
laptop 8.2
style 8.1
meal 8.1
handsome 8
employee 7.9
black 7.8
education 7.8
glass 7.8
party 7.7
blackboard 7.7
luxury 7.7
class 7.7
comfortable 7.6
two 7.6
elegance 7.5
musical instrument 7.4
coffee 7.4
educator 7.3
waiter 7.3
board 7.2
professional 7.2
looking 7.2
family 7.1
night 7.1
face 7.1

Google

Microsoft

person 96.7
black and white 90.8
family 28.1
food 20.9
monochrome 19.3

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 53.4%
Angry 45.7%
Happy 46.8%
Sad 47%
Confused 45.5%
Disgusted 46.2%
Calm 48.2%
Surprised 45.6%

AWS Rekognition

Age 4-9
Gender Female, 51.5%
Angry 46.5%
Disgusted 45.8%
Surprised 46.2%
Sad 46.1%
Confused 45.5%
Calm 46.5%
Happy 48.5%

AWS Rekognition

Age 38-57
Gender Male, 54.4%
Sad 45.3%
Angry 45.3%
Surprised 45.5%
Happy 46.9%
Calm 51.5%
Confused 45.2%
Disgusted 45.3%

AWS Rekognition

Age 27-44
Gender Male, 54.8%
Happy 45.9%
Calm 51.9%
Sad 46.1%
Angry 45.4%
Surprised 45.2%
Disgusted 45.3%
Confused 45.1%

AWS Rekognition

Age 45-65
Gender Male, 50.5%
Sad 45.4%
Calm 51.9%
Happy 45.5%
Surprised 45.6%
Disgusted 45.8%
Confused 45.4%
Angry 45.3%
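Each face block above reports an age range, a gender guess, and a distribution over eight emotions, mirroring a Rekognition `DetectFaces` `FaceDetail` structure. A sketch of reducing one such structure to a summary, assuming Rekognition's documented field names (`AgeRange`, `Gender`, `Emotions`); the `summarize_face` helper is a hypothetical name:

```python
def summarize_face(detail):
    """Reduce a DetectFaces FaceDetail to age range, gender, and dominant emotion."""
    top = max(detail["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": (detail["AgeRange"]["Low"], detail["AgeRange"]["High"]),
        "gender": detail["Gender"]["Value"],
        "emotion": top["Type"],
    }

# Fragment shaped like the third face above, where Calm (51.5%) is the top score:
face = {
    "AgeRange": {"Low": 38, "High": 57},
    "Gender": {"Value": "Male", "Confidence": 54.4},
    "Emotions": [
        {"Type": "CALM", "Confidence": 51.5},
        {"Type": "HAPPY", "Confidence": 46.9},
        {"Type": "SAD", "Confidence": 45.3},
    ],
}
print(summarize_face(face))  # → {'age': (38, 57), 'gender': 'Male', 'emotion': 'CALM'}
```

Note that the emotion scores cluster near 45–52%, so the "dominant" emotion is only weakly preferred over the alternatives.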

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a group of people standing in front of a store 85.5%
a group of people standing in a room 85.4%
a group of people in front of a store 81.9%
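The three Microsoft captions above each carry a confidence score, resembling the output of an image-description service. A small sketch of choosing the best caption from such (text, confidence) pairs; the helper name is illustrative:

```python
def best_caption(captions):
    """Pick the (text, confidence) pair with the highest confidence."""
    return max(captions, key=lambda c: c[1])

# The three candidate captions listed above:
captions = [
    ("a group of people standing in front of a store", 85.5),
    ("a group of people standing in a room", 85.4),
    ("a group of people in front of a store", 81.9),
]
print(best_caption(captions)[0])  # → a group of people standing in front of a store
```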

Text analysis

Amazon

KODIK
2VEE1A
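The two text fragments above ("KODIK", "2VEE1A") are the kind of detections AWS Rekognition's `DetectText` returns. A sketch of collecting line-level detections from such a response, assuming Rekognition's documented `TextDetections` shape; the confidence values in the sample are illustrative, not from the record:

```python
def detected_lines(response):
    """Collect the text of LINE-type detections from a DetectText response."""
    return [d["DetectedText"] for d in response["TextDetections"]
            if d["Type"] == "LINE"]

# Fragment shaped like the detections above (confidences are made up for the example):
sample = {"TextDetections": [
    {"DetectedText": "KODIK", "Type": "LINE", "Confidence": 90.0},
    {"DetectedText": "2VEE1A", "Type": "LINE", "Confidence": 88.0},
    {"DetectedText": "KODIK", "Type": "WORD", "Confidence": 90.0},
]}
print(detected_lines(sample))  # → ['KODIK', '2VEE1A']
```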