Human Generated Data

Title

Untitled (view of department store interior during sale)

Date

1950

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6267

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 84
Human 84
Pottery 82.7
Person 81.2
Chess 78.1
Game 78.1
Ceiling Fan 73.5
Appliance 73.5
Lighting 71.9
Interior Design 70.6
Indoors 70.6
Art 69.4
Person 66.4
Porcelain 64.8
Person 59.3
Meal 57.5
Food 57.5
Furniture 56.8
Restaurant 55.2
Person 45.9
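
These labels have the shape of an Amazon Rekognition DetectLabels response: a label name plus a confidence score on a 0–100 scale (repeated names such as Person are separate detected instances). A minimal boto3 sketch of how such tags could be regenerated, assuming local image bytes, default AWS credentials, and a placeholder filename:

```python
import boto3

# Rekognition label detection; confidences are percentages (0-100).
client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:  # placeholder filename
    resp = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=40)

for label in resp["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```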

Clarifai
created on 2023-10-26

people 99.3
group 97.5
monochrome 94
no person 93.5
many 93.5
art 90.9
travel 90.7
street 88.8
design 87
man 86.6
light 86.4
commerce 86.1
city 82.8
watercraft 82.6
vehicle 81.7
furniture 81.3
music 80.2
market 80.1
military 79.2
black and white 78.8
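
Clarifai's general model returns concepts with confidences on a 0–1 scale, which the list above shows multiplied by 100. A rough sketch against Clarifai's REST API, assuming the public general-image-recognition model and placeholder API key and image URL:

```python
import requests

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},  # placeholder key
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```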

Imagga
created on 2022-01-22

equipment 24.9
electronic equipment 20
mixer 17.8
glass 16.4
technology 16.3
light 16
interior 15
digital 14.6
modern 13.3
table 12.9
furniture 12.5
alcohol 12.1
party 12
design 11.8
wineglass 11.7
3d 11.6
business 11.5
drink 10.8
container 10.7
black 10.3
three dimensional 10.3
stage 10.3
amplifier 10.1
bottle 9.9
wine 9.7
information 9.7
home 9.6
luxury 9.4
beverage 9.3
graphics 9.2
house 9.2
city 9.1
hand 9.1
food 9.1
kitchen 8.9
urban 8.7
counter 8.6
architecture 8.6
menorah 8.5
effects 8.5
indoor 8.2
music 8.1
decoration 8
computer 8
night 8
restaurant 8
machine 7.9
indoors 7.9
structure 7.9
render 7.8
communication 7.6
reflection 7.4
case 7.4
inside 7.4
lamp 7.4
metal 7.2
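
Imagga's tagging endpoint reports English tag names with 0–100 confidences, matching the list above. A sketch using its REST API with HTTP basic auth; the key, secret, and image URL are placeholders:

```python
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```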

Microsoft
created on 2022-01-22

black and white 88
text 75.7
water 60.4
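
These scores have the form returned by the Azure Computer Vision Analyze endpoint, which reports tags with 0–1 confidences (shown above scaled to 0–100). A sketch against the v3.2 REST API, with placeholder endpoint, key, and image URL:

```python
import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
    json={"url": "https://example.com/photo.jpg"},  # placeholder URL
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```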

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 56.8%
Sad 67.3%
Calm 21.2%
Confused 4.3%
Happy 2.9%
Disgusted 1.8%
Angry 1.1%
Surprised 0.9%
Fear 0.6%

AWS Rekognition

Age 24-34
Gender Female, 97.7%
Calm 56.8%
Happy 21.3%
Sad 14.4%
Confused 3%
Angry 2.7%
Disgusted 0.8%
Surprised 0.5%
Fear 0.5%

AWS Rekognition

Age 33-41
Gender Female, 97.6%
Happy 89.9%
Calm 2.4%
Surprised 2.3%
Sad 2%
Angry 1.3%
Fear 0.8%
Disgusted 0.7%
Confused 0.6%

AWS Rekognition

Age 21-29
Gender Female, 64.4%
Happy 51.8%
Calm 34.8%
Surprised 4.7%
Angry 2.6%
Sad 2.5%
Disgusted 1.7%
Fear 1.1%
Confused 0.8%

AWS Rekognition

Age 19-27
Gender Male, 74.6%
Sad 72%
Calm 19.1%
Happy 4.9%
Confused 2.5%
Disgusted 0.5%
Angry 0.4%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 18-24
Gender Female, 91.9%
Calm 51.6%
Happy 38.8%
Sad 3.1%
Disgusted 1.6%
Fear 1.5%
Surprised 1.4%
Angry 1.3%
Confused 0.6%

AWS Rekognition

Age 12-20
Gender Male, 98.4%
Calm 65.9%
Disgusted 17.4%
Surprised 5.7%
Sad 3.8%
Angry 3.6%
Happy 2.6%
Confused 0.6%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 83.7%
Calm 86.2%
Surprised 4.1%
Happy 3.1%
Disgusted 2.8%
Sad 1.8%
Angry 1%
Confused 0.5%
Fear 0.5%

AWS Rekognition

Age 14-22
Gender Male, 65.1%
Calm 66.1%
Sad 12.4%
Happy 8.3%
Confused 6.4%
Disgusted 2.9%
Fear 1.5%
Angry 1.2%
Surprised 1%

AWS Rekognition

Age 23-33
Gender Female, 57.1%
Calm 98.9%
Happy 0.3%
Fear 0.3%
Sad 0.2%
Angry 0.2%
Confused 0.1%
Disgusted 0.1%
Surprised 0%

AWS Rekognition

Age 19-27
Gender Female, 94.8%
Calm 58.1%
Happy 18%
Sad 12.4%
Angry 6%
Fear 1.8%
Surprised 1.6%
Confused 1.1%
Disgusted 0.9%
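
Each block above matches one FaceDetails entry from Rekognition's DetectFaces call with all facial attributes requested: an estimated age range, a gender prediction with confidence, and the eight emotion scores. A boto3 sketch that prints the same fields, assuming a placeholder filename and default credentials:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:  # placeholder filename
    resp = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in resp["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort descending to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```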

Feature analysis

Amazon

Person 84%
Chess 78.1%
Ceiling Fan 73.5%

Text analysis

Amazon

NO
KODAK
3.50
2.50
-ECCESSORES
REFUNDS
KODAK BYLEEX
L75
1.50
IDCHANGES
DE
BYLEEX
KIA
WAS
TO LEAR
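
These strings, including garbled reads such as -ECCESSORES and BYLEEX that are kept verbatim as the model returned them, are typical word-level output from Rekognition's DetectText. A boto3 sketch, again with a placeholder filename:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photo.jpg", "rb") as f:  # placeholder filename
    resp = client.detect_text(Image={"Bytes": f.read()})

# Each detection is either a LINE or one of its WORDs; print words only.
for det in resp["TextDetections"]:
    if det["Type"] == "WORD":
        print(det["DetectedText"])
```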

Google

-KCESSONES NO
-KCESSONES
NO
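
Google's entries look like word-level annotations from the Cloud Vision text detection feature. A sketch with the google-cloud-vision client, assuming application-default credentials and a placeholder filename:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

resp = client.text_detection(image=image)
# Entry 0 is the full detected text block; the rest are individual words.
for annotation in resp.text_annotations[1:]:
    print(annotation.description)
```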