Human Generated Data

Title

Untitled (children and nun posed standing around large school kitchen)

Date

1950-1955

People

Artist: Orrion Barger, American, active 1913-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6581

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Person 99.7
Human 99.7
Restaurant 98.8
Cafeteria 96.8
Person 96
Person 94.6
Person 90
Person 86.3
Person 80.9
Meal 77.5
Food 77.5
Cafe 65.8
Floor 63.8
Flooring 59.5
Indoors 59.4
Lab 56.8
Person 44.2
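
The Amazon tags above are label/confidence pairs of the kind returned by the AWS Rekognition DetectLabels operation. A minimal sketch of how comparable tags could be generated with boto3 is given below; the local filename and the confidence threshold are assumptions for illustration, not values taken from this record.

# Sketch: label detection with AWS Rekognition DetectLabels (boto3).
# "photo.jpg" and MinConfidence=40 are assumptions, not part of the record.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:  # hypothetical local copy of the photograph
    response = rekognition.detect_labels(
        Image={"Bytes": image_file.read()},
        MinConfidence=40,  # the record lists labels down to roughly 44% confidence
    )

for label in response["Labels"]:
    # Print "Name Confidence" pairs, mirroring the tag list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")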

Clarifai
created on 2019-03-26

people 99.4
furniture 98
monochrome 97.8
indoors 95.8
room 94.9
group 94.4
group together 93.2
adult 93.1
woman 91.6
man 90.8
chair 90.1
stock 89.9
seat 86.1
two 83
many 82.3
several 81.1
hospital 81.1
street 79.1
exhibition 78.4
restaurant 78

Imagga
created on 2019-03-26

counter 70.6
interior 61.9
kitchen 42.1
shop 38.6
modern 37.9
room 36.2
barbershop 34.3
house 34.3
furniture 33.8
home 33.5
luxury 27.4
decor 27.4
mercantile establishment 26.4
table 26.3
chair 26.2
architecture 25.8
stove 25.2
steel 24.8
indoor 24.7
indoors 23.7
window 23.4
design 22.5
inside 22.1
apartment 21.1
floor 20.5
oven 19.8
light 19.4
wood 19.2
dishwasher 18.5
appliance 18.4
cabinet 18
place of business 17.5
stainless 17.5
white goods 16.8
glass 16.3
lamp 16.2
new 16.2
building 16
sink 15.8
home appliance 15.6
restaurant 15.5
residential 15.3
comfortable 15.3
contemporary 15
cooking 14.8
expensive 14.4
office 14.3
business 14
clean 13.4
3d 13.2
wall 12.8
tile 12.5
dining 12.4
urban 12.2
cook 11.9
city 11.6
decoration 11.6
microwave 11.1
island 11
nobody 10.9
food 10.9
domestic 10.8
faucet 10.8
estate 10.4
living 10.4
elegant 10.3
equipment 10.3
refrigerator 10.2
dinner 10.1
hall 10.1
chairs 9.8
style 9.6
man 9.4
construction 9.4
stylish 9
people 8.9
station 8.9
cabinets 8.9
metal 8.8
mansion 8.8
granite 8.8
real estate 8.8
machine 8.7
hotel 8.6
empty 8.6
mirror 8.6
real 8.5
buy 8.5
elegance 8.4
horizontal 8.4
establishment 8.3
structure 8.1
seat 8
wooden 7.9
remodel 7.9
work 7.8
render 7.8
designer 7.7
lighting 7.7
sofa 7.7
lifestyle 7.2
shelf 7.1
life 7
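
The Imagga tags above follow the same tag/confidence layout. A hedged sketch of querying Imagga's v2 tagging endpoint is shown below; the API credentials and image URL are placeholders, and the response layout is assumed from Imagga's v2 documentation rather than from this record.

# Sketch: tagging an image with the Imagga v2 REST API via requests.
# IMAGGA_KEY, IMAGGA_SECRET, and IMAGE_URL are placeholders for illustration.
import requests

IMAGGA_KEY = "your-api-key"          # placeholder credential
IMAGGA_SECRET = "your-api-secret"    # placeholder credential
IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    # Each entry carries an English tag name and a confidence score.
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")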

Google
created on 2019-03-26

Microsoft
created on 2019-03-26

indoor 93.1
floor 91.9
black and white 91.9
kitchen 88.2
food 46.4
exhibition 32.6
person 30.3
monochrome 23

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Sad 49.8%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%
Confused 49.6%
Calm 50.1%
Surprised 49.5%

AWS Rekognition

Age 35-52
Gender Female, 50.5%
Angry 49.5%
Disgusted 50.4%
Confused 49.5%
Surprised 49.5%
Sad 49.5%
Happy 49.5%
Calm 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.3%
Angry 49.7%
Happy 49.6%
Disgusted 49.7%
Surprised 49.6%
Calm 49.6%
Sad 49.9%
Confused 49.5%

AWS Rekognition

Age 35-53
Gender Male, 52.7%
Confused 45.1%
Angry 45.4%
Surprised 45.2%
Disgusted 45.1%
Calm 51.8%
Happy 45.5%
Sad 46.9%

AWS Rekognition

Age 20-38
Gender Male, 52%
Angry 45.5%
Happy 45.2%
Sad 46.1%
Confused 45.3%
Disgusted 45.1%
Surprised 45.3%
Calm 52.4%

AWS Rekognition

Age 20-38
Gender Female, 50.2%
Disgusted 49.6%
Happy 49.8%
Angry 49.5%
Sad 49.7%
Calm 49.8%
Confused 49.5%
Surprised 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50%
Disgusted 49.9%
Sad 49.7%
Surprised 49.6%
Calm 49.6%
Confused 49.6%
Angry 49.6%
Happy 49.6%

AWS Rekognition

Age 48-68
Gender Female, 50.1%
Confused 49.5%
Happy 49.5%
Angry 49.7%
Disgusted 49.5%
Sad 50.1%
Calm 49.6%
Surprised 49.5%

AWS Rekognition

Age 49-69
Gender Female, 50.4%
Angry 49.6%
Sad 50.1%
Surprised 49.5%
Confused 49.5%
Calm 49.6%
Happy 49.6%
Disgusted 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Calm 50%
Confused 49.5%
Sad 49.8%
Angry 49.5%
Happy 49.5%
Surprised 49.5%
Disgusted 49.5%
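
The face analysis entries above report an age range, a gender estimate, and per-emotion confidences in the format returned by the AWS Rekognition DetectFaces operation. A minimal sketch, again assuming a hypothetical local copy of the photograph:

# Sketch: face attribute detection with AWS Rekognition DetectFaces (boto3).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:  # hypothetical local copy of the photograph
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")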

Feature analysis

Amazon

Person 99.7%

Categories

Text analysis

Amazon

YT33A2
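
The text result above is the kind of string produced by the AWS Rekognition DetectText operation, which reads printed or stenciled characters visible in the photograph. A minimal sketch under the same assumptions as the examples above:

# Sketch: text detection with AWS Rekognition DetectText (boto3).
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as image_file:  # hypothetical local copy of the photograph
    response = rekognition.detect_text(Image={"Bytes": image_file.read()})

# Print only LINE-level detections to mirror the single string in the record.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])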