Human Generated Data

Title

Untitled (group of students posed around table watching science experiment in classroom)

Date

1949-1950

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Machine Generated Data

Tags

Amazon

Person 99.5
Human 99.5
Person 99.4
Person 99.1
Person 98.7
Person 98
Restaurant 98
Meal 95.3
Food 95.3
Person 95.2
Person 95.1
Person 91.8
Cafeteria 88.1
Worker 72.5
Accessories 69.3
Sunglasses 69.3
Accessory 69.3
Cafe 61.6
Clothing 61.4
Apparel 61.4
Dish 57.7
Food Court 56.2

Clarifai

people 99.8
group 99
group together 98.5
adult 96.7
man 96.5
furniture 95.2
many 94.9
several 94.6
woman 93.8
room 91.8
leader 89.9
chair 89
administration 87.3
two 84.1
indoors 83.5
wear 83.2
vehicle 82.6
five 81.4
three 81.3
four 81

Imagga

counter 50.4
shop 36.5
city 29.9
building 25.1
mercantile establishment 24.8
people 23.4
architecture 22.6
urban 21.8
barbershop 20.6
place of business 16.3
business 15.8
old 15.3
plaza 14.6
house 14.2
man 14.1
street 13.8
men 12.9
travel 12.7
town 12
window 12
historic 11.9
office 11.7
history 11.6
tourism 11.5
life 11.2
women 11.1
restaurant 10.7
male 10.6
couple 10.4
religion 9.9
adult 9.8
indoors 9.7
love 9.5
culture 9.4
horizontal 9.2
sky 8.9
light 8.7
scene 8.6
work 8.6
boutique 8.4
famous 8.4
church 8.3
inside 8.3
dress 8.1
landmark 8.1
home 8
interior 8
lifestyle 7.9
hall 7.9
establishment 7.9
station 7.8
black 7.8
ancient 7.8
glass 7.8
ceremony 7.8
modern 7.7
crowd 7.7
stone 7.6
chair 7.5
historical 7.5
buy 7.5
color 7.2
transportation 7.2
structure 7.1
case 7

Microsoft

person 97.4
black and white 51.6
shelf 51.6
lego 47.8
monochrome 14.8

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 52.2%
Surprised 45.9%
Angry 48.1%
Disgusted 45.3%
Happy 45.2%
Sad 46%
Confused 45.8%
Calm 48.7%

AWS Rekognition

Age 19-36
Gender Male, 54.5%
Sad 49.4%
Confused 45.4%
Happy 45.2%
Angry 45.4%
Calm 49.2%
Disgusted 45.2%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Surprised 45.2%
Angry 45.4%
Disgusted 45.1%
Happy 45.1%
Sad 46.8%
Confused 45.3%
Calm 52.3%

AWS Rekognition

Age 30-47
Gender Female, 50.2%
Disgusted 45.1%
Surprised 45.2%
Confused 45.2%
Calm 52.9%
Happy 45.1%
Angry 45.2%
Sad 46.3%

AWS Rekognition

Age 35-52
Gender Male, 52.5%
Happy 45.3%
Calm 49.9%
Sad 48.8%
Angry 45.3%
Surprised 45.3%
Disgusted 45.1%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Happy 45%
Disgusted 45%
Sad 45%
Angry 45%
Confused 45%
Surprised 45%
Calm 54.9%

Feature analysis

Amazon

Person 99.5%
Sunglasses 69.3%

Captions

Microsoft

a group of people in a room 81.2%
a group of people standing in front of a building 64.1%
a group of people standing in a room 64%

Text analysis

Amazon

AVIELA
Iea