Human Generated Data

Title

View into the Adler Cabriolet, 1930-1933

Date

c. 1930-1933

People

Artist: Walter Gropius, German, 1883-1969

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Walter Gropius, BRGA.34.2

Machine Generated Data

Tags

Amazon
created on 2022-06-04

Person 96.8
Human 96.8
Monitor 72.4
Electronics 72.4
Screen 72.4
Display 72.4
Person 66.4
Art 63.1

Clarifai
created on 2023-10-31

people 99.6
furniture 99.4
room 97
analogue 96.4
portrait 95.6
no person 95.4
vintage 95.2
family 95.1
art 94.9
adult 94.5
group 93.8
retro 93.4
indoors 92.9
one 92.1
picture frame 91.5
seat 91.1
vehicle 91
two 90.9
painting 90.7
museum 90.5

Imagga
created on 2022-06-04

microwave 97.9
kitchen appliance 87.9
home appliance 77.5
appliance 55.4
television 28.4
monitor 26
durables 25.6
home 23.9
technology 23.7
equipment 22.8
screen 19.8
interior 19.4
room 17.6
house 17.5
modern 17.5
dishwasher 17.3
kitchen 16.4
white goods 16
electronic 14.9
computer 14.7
box 14.5
metal 13.7
steel 13.3
safe 13
digital 13
security 12.8
open 12.6
electronic equipment 12.4
telecommunication system 12.2
business 12.1
old 11.8
design 11.8
stainless 11.6
money 11
oven 10.8
bank 10.7
stove 10.5
broadcasting 10.4
display 10.3
object 10.2
furniture 10.2
finance 10.1
safety 10.1
indoor 10
wood 10
protection 10
silver 9.7
new 9.7
camera 9.6
toaster 9.6
empty 9.4
wall 9.4
space 9.3
communication 9.2
inside 9.2
cash 9.1
working 8.8
light 8.7
machine 8.7
decoration 8.7
lock 8.6
luxury 8.6
storage 8.6
estate 8.5
3d 8.5
black 8.4
clean 8.3
service 8.3
banking 8.3
window 8.2
style 8.2
indoors 7.9
architecture 7.8
deposit 7.8
nobody 7.8
blank 7.7
apartment 7.7
media 7.6
old fashioned 7.6
keyboard 7.5
floor 7.4
retro 7.4
domestic 7.2
wealth 7.2
copy 7.1
work 7.1
sink 7

Microsoft
created on 2022-06-04

wall 99.3
indoor 98.9
gallery 98.2
text 92
scene 91.4
room 89.5
land vehicle 87.8
vehicle 87.6
black and white 85.2
black 82.1
white 74.5
car 67.5
picture frame 39.8

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 60.6%
Calm 95.1%
Surprised 6.9%
Fear 5.9%
Confused 2.9%
Sad 2.2%
Disgusted 0.2%
Angry 0.1%
Happy 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.8%
Monitor 72.4%

Categories

Imagga

interior objects 99.8%

Text analysis

Amazon

34.2

Google

34.2