Human Generated Data

Title

Untitled (group of students posed around table watching science experiment in classroom)

Date

1949-1950

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6609

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Human 99.5
Person 99.5
Person 99.4
Person 99.1
Person 98.7
Person 98
Restaurant 98
Meal 95.3
Food 95.3
Person 95.2
Person 95.1
Person 91.8
Cafeteria 88.1
Worker 72.5
Accessory 69.3
Accessories 69.3
Sunglasses 69.3
Cafe 61.6
Clothing 61.4
Apparel 61.4
Dish 57.7
Food Court 56.2
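The Amazon list above has the shape of an AWS Rekognition `detect_labels` response flattened into "Name Confidence" lines. A minimal sketch of that flattening, using a small hand-written stand-in for the API response (a live call would need boto3 and AWS credentials, so none is made here):

```python
# Sketch: flattening a Rekognition-style detect_labels response into
# "Name Confidence" lines like the tag list above. `response` is a
# hand-written stand-in for the detect_labels return value, not a live call.
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.0},
        {"Name": "Human", "Confidence": 99.5},
        {"Name": "Food Court", "Confidence": 56.2},
    ]
}

def format_labels(response, min_confidence=55.0):
    """Return 'Name Confidence' lines, highest confidence first."""
    labels = [l for l in response["Labels"] if l["Confidence"] >= min_confidence]
    labels.sort(key=lambda l: l["Confidence"], reverse=True)
    # :g drops the trailing .0, matching entries like "Person 98" above
    return [f"{l['Name']} {l['Confidence']:g}" for l in labels]

for line in format_labels(response):
    print(line)
```

The `min_confidence` cutoff is an assumption for illustration; the lists above simply end in the mid-50s, which suggests a similar threshold was applied upstream.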

Clarifai
created on 2019-03-26

people 99.8
group 99
group together 98.5
adult 96.7
man 96.5
furniture 95.2
many 94.9
several 94.6
woman 93.8
room 91.8
leader 89.9
chair 89
administration 87.3
two 84.1
indoors 83.5
wear 83.2
vehicle 82.6
five 81.4
three 81.3
four 81

Imagga
created on 2019-03-26

counter 50.4
shop 36.5
city 29.9
building 25.1
mercantile establishment 24.8
people 23.4
architecture 22.6
urban 21.8
barbershop 20.6
place of business 16.3
business 15.8
old 15.3
plaza 14.6
house 14.2
man 14.1
street 13.8
men 12.9
travel 12.7
town 12
window 12
historic 11.9
office 11.7
history 11.6
tourism 11.5
life 11.2
women 11.1
restaurant 10.7
male 10.6
couple 10.4
religion 9.9
adult 9.8
indoors 9.7
love 9.5
culture 9.4
horizontal 9.2
sky 8.9
light 8.7
scene 8.6
work 8.6
boutique 8.4
famous 8.4
church 8.3
inside 8.3
dress 8.1
landmark 8.1
home 8
interior 8
lifestyle 7.9
hall 7.9
establishment 7.9
station 7.8
black 7.8
ancient 7.8
glass 7.8
ceremony 7.8
modern 7.7
crowd 7.7
stone 7.6
chair 7.5
historical 7.5
buy 7.5
color 7.2
transportation 7.2
structure 7.1
case 7

Google
created on 2019-03-26

Microsoft
created on 2019-03-26

person 97.4
shelf 51.6
black and white 51.6
lego 47.8
monochrome 14.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 52.2%
Surprised 45.9%
Angry 48.1%
Disgusted 45.3%
Happy 45.2%
Sad 46%
Confused 45.8%
Calm 48.7%

AWS Rekognition

Age 19-36
Gender Male, 54.5%
Sad 49.4%
Confused 45.4%
Happy 45.2%
Angry 45.4%
Calm 49.2%
Disgusted 45.2%
Surprised 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Surprised 45.2%
Angry 45.4%
Disgusted 45.1%
Happy 45.1%
Sad 46.8%
Confused 45.3%
Calm 52.3%

AWS Rekognition

Age 30-47
Gender Female, 50.2%
Disgusted 45.1%
Surprised 45.2%
Confused 45.2%
Calm 52.9%
Happy 45.1%
Angry 45.2%
Sad 46.3%

AWS Rekognition

Age 35-52
Gender Male, 52.5%
Happy 45.3%
Calm 49.9%
Sad 48.8%
Angry 45.3%
Surprised 45.3%
Disgusted 45.1%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Happy 45%
Disgusted 45%
Sad 45%
Angry 45%
Confused 45%
Surprised 45%
Calm 54.9%
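Each face block above mirrors one entry of a Rekognition `detect_faces` response requested with `Attributes=['ALL']`. A sketch of how such an entry reduces to the Age/Gender/Emotion lines shown, again with a hand-written stand-in for the response rather than a live API call:

```python
# Sketch: pulling age range, gender, and emotion confidences out of a
# Rekognition-style detect_faces entry. `face` is a hand-written stand-in
# for one element of response["FaceDetails"], not a live call.
face = {
    "AgeRange": {"Low": 26, "High": 43},
    "Gender": {"Value": "Female", "Confidence": 52.2},
    "Emotions": [
        {"Type": "ANGRY", "Confidence": 48.1},
        {"Type": "CALM", "Confidence": 48.7},
        {"Type": "SAD", "Confidence": 46.0},
    ],
}

def summarize_face(face):
    """Return lines in the 'Age low-high / Gender / Emotion pct%' style above."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:g}%",
    ]
    # Strongest emotion first; :g drops trailing .0 (e.g. "Sad 46%")
    for e in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{e['Type'].capitalize()} {e['Confidence']:g}%")
    return lines

print("\n".join(summarize_face(face)))
```

Note that the emotion scores in each block above sum to well over 100% and cluster near 45%; they are per-emotion confidences, not a probability distribution, so only their relative ordering is meaningful.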

Feature analysis

Amazon

Person 99.5%
Sunglasses 69.3%

Categories

Text analysis

Amazon

AVIELA
Iea