Human Generated Data

Title

Untitled (man looking out drive-in window)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16666

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.5
Human 98.5
Furniture 97.4
Person 94.9
Home Decor 81
Clothing 67.2
Apparel 67.2
Face 61.7
Drawer 59.7

Clarifai
created on 2023-10-29

monochrome 99.8
people 98.9
window 95.3
indoors 95.1
man 93.8
wedding 93.2
one 92.3
woman 92.1
two 91.7
door 91.7
vehicle window 90.5
adult 90.5
street 90.4
black and white 89
vehicle 88.6
square 87.6
mirror 85.9
model 83.9
glass 81.7
light 77.3

Imagga
created on 2022-02-26

home appliance 54.3
refrigerator 52.8
microwave 43.9
appliance 41.9
white goods 37.4
kitchen appliance 34.4
furniture 33.7
interior 33.6
home 33.5
room 27.2
modern 25.2
house 24.2
cabinet 22
monitor 20.8
architecture 18.7
durables 18.2
luxury 17.1
design 16.9
inside 16.6
medicine chest 16.4
screen 16.3
wall 16.2
door 16.1
floor 15.8
machine 15.6
3d 15.5
window 14.9
technology 14.8
business 14.6
apartment 14.4
open 14.4
indoor 13.7
light 13.4
clean 13.4
equipment 12.9
metal 12.9
office 12.8
kitchen 12.8
indoors 12.3
new 12.1
furnishing 12
glass 11.7
space 10.9
computer 10.8
bathroom 10.5
empty 10.3
banking 10.1
nobody 10.1
domestic 9.9
bank 9.8
building 9.5
contemporary 9.4
money 9.4
safe 9.3
wood 9.2
gray 9
fridge 8.9
steel 8.8
looking 8.8
lifestyle 8.7
render 8.6
work 8.6
residential 8.6
expensive 8.6
estate 8.5
printer 8.5
finance 8.4
electronic 8.4
frame 8.3
file 8.3
cash 8.2
style 8.2
currency 8.1
object 8.1
success 8
decor 8
doorway 7.9
life 7.8
luxurious 7.8
device 7.8
construction 7.7
sky 7.6
storage 7.6
living 7.6
keyboard 7.5
panel 7.4
silver 7.1

Microsoft
created on 2022-02-26

text 98.7
vehicle 81.3
clothing 77
person 74.8
car 73.5
black and white 67
white 64.3
man 58
land vehicle 51.8

Color Analysis

Face analysis

AWS Rekognition

Age 39-47
Gender Male, 96%
Fear 57.2%
Calm 15.6%
Sad 12.3%
Happy 6.8%
Angry 2.5%
Confused 2.4%
Disgusted 2%
Surprised 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.5%
Person 94.9%

Categories

Imagga

interior objects 100%

Text analysis

Google

YAGON
YAGON