Human Generated Data

Title

Untitled (portraits of men on wall, advertising executives)

Date

1951

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20126

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Furniture 99.1
Couch 92.7
Person 91.2
Human 91.2
Room 90.2
Indoors 90.2
Living Room 90.1
Person 83.7
Person 82.2
Person 81.9
Person 75
Person 74.6
Person 73.5
Person 72.4
Advertisement 72.3
Person 71.9
Person 71.7
Person 71.1
Person 70.8
Person 70.3
Poster 70.1
Table 66.3
Text 65.8
Paper 61.9
Person 60.5
Wall 59.8
Bedroom 59.7
Dorm Room 57.1
Person 56.7
Flyer 55.3
Brochure 55.3
Person 49.7
Person 42.1

Clarifai
created on 2023-10-22

no person 98.1
indoors 96.8
room 93.8
architecture 92.9
furniture 91.4
shelf 91
family 89.6
illustration 88.1
inside 87.7
house 87.7
business 87.3
wall 86.1
contemporary 85.9
picture frame 84.9
paper 84.7
window 84.4
design 83.5
monochrome 83.2
retro 82.3
empty 80.7

Imagga
created on 2022-03-05

interior 34.5
room 31
digital clock 30.3
modern 28.1
keyboard 26.8
clock 26
furniture 25.8
home 24.1
safe 22.5
scoreboard 22.1
design 21.4
board 21.3
case 21.1
3d 19.4
computer 19.3
technology 19.3
wall 19
timepiece 18.9
key 18.8
button 18.5
signboard 16.2
floor 15.8
box 15.7
device 15.6
office 15.4
strongbox 15.4
business 15.2
contemporary 15.1
living 14.2
sofa 13.9
empty 12.9
architecture 12.6
equipment 12.6
house 12.5
object 12.5
apartment 12.5
lamp 12.4
render 12.1
vase 11.6
table 11.4
style 11.1
domestic 10.9
lifestyle 10.8
instrument 10.7
rendering 10.5
door 10.5
bank 10.4
machine 10.3
number 10.3
finance 10.1
light 10
frame 10
carpet 9.7
digital 9.7
container 9.7
couch 9.7
comfort 9.7
comfortable 9.6
showing 9.4
space 9.3
communication 9.2
inside 9.2
data 9.1
sign 9
calculator 8.9
information 8.9
indoors 8.8
keypad 8.8
buttons 8.5
word 8.5
electronic 8.4
window 8.2
financial 8
icon 7.9
work 7.9
glass 7.8
money 7.7
web 7.6
display 7.4
banking 7.4
new 7.3
gray 7.2

Google
created on 2022-03-05

Furniture 94
Couch 90.9
Product 90.7
Rectangle 90.6
Black 89.8
Shelf 88.4
Picture frame 87.4
Interior design 86.1
Grey 84.2
Shelving 83
Line 82.5
Font 82.5
Wall 82.5
Wood 81.3
Art 79.2
Living room 76.4
Tints and shades 75.9
Wall sticker 72
Comfort 71.3
studio couch 71.2

Microsoft
created on 2022-03-05

text 98.3
black and white 97
indoor 89.5
monochrome 79.6
picture frame 74.4
art 55.4
drawing 53.3

Face analysis

AWS Rekognition

Age 13-21
Gender Male, 80.7%
Sad 45%
Happy 31.1%
Angry 8.2%
Confused 5%
Calm 4.5%
Surprised 3.1%
Fear 1.7%
Disgusted 1.3%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Calm 87%
Happy 4%
Fear 3.5%
Sad 2.2%
Surprised 1.4%
Angry 0.8%
Disgusted 0.6%
Confused 0.4%

AWS Rekognition

Age 23-33
Gender Female, 67.5%
Calm 35.3%
Sad 25.1%
Angry 16.3%
Happy 9.9%
Fear 8.2%
Confused 2.3%
Disgusted 2.2%
Surprised 0.7%

AWS Rekognition

Age 34-42
Gender Male, 99.5%
Fear 45.1%
Calm 30.9%
Happy 13.1%
Sad 4.5%
Angry 3.4%
Confused 1.2%
Surprised 1%
Disgusted 0.8%

AWS Rekognition

Age 24-34
Gender Male, 83.6%
Sad 51.1%
Calm 22.5%
Angry 11.3%
Surprised 4.6%
Happy 3.1%
Fear 2.9%
Confused 2.8%
Disgusted 1.8%

AWS Rekognition

Age 31-41
Gender Female, 63.1%
Calm 45.3%
Sad 38.8%
Disgusted 4.1%
Surprised 3.4%
Happy 2.9%
Angry 2.8%
Fear 1.7%
Confused 1.1%

AWS Rekognition

Age 19-27
Gender Female, 92.2%
Happy 85%
Calm 9.6%
Fear 2.2%
Sad 1.6%
Disgusted 0.4%
Confused 0.4%
Surprised 0.3%
Angry 0.3%

AWS Rekognition

Age 21-29
Gender Female, 87.4%
Calm 79%
Sad 8.9%
Confused 4.5%
Happy 3.8%
Angry 1.2%
Fear 1.1%
Surprised 1%
Disgusted 0.5%

AWS Rekognition

Age 23-33
Gender Male, 98.6%
Happy 36.7%
Calm 35.4%
Sad 15.2%
Fear 8.1%
Confused 2.2%
Disgusted 0.9%
Angry 0.9%
Surprised 0.6%

AWS Rekognition

Age 33-41
Gender Male, 54%
Fear 28.5%
Calm 27.3%
Happy 17.6%
Angry 8.6%
Confused 7.1%
Sad 4.9%
Surprised 3.6%
Disgusted 2.4%

AWS Rekognition

Age 28-38
Gender Male, 99.9%
Calm 26.3%
Sad 23.9%
Confused 20.2%
Angry 10.2%
Happy 8%
Disgusted 6.4%
Surprised 2.7%
Fear 2.4%

AWS Rekognition

Age 33-41
Gender Male, 99.4%
Calm 74.8%
Surprised 8.4%
Disgusted 7%
Sad 2.8%
Angry 2.4%
Happy 1.8%
Fear 1.6%
Confused 1%

AWS Rekognition

Age 26-36
Gender Female, 94.1%
Calm 70.4%
Sad 23.4%
Surprised 1.9%
Fear 1.3%
Disgusted 1.2%
Confused 0.9%
Happy 0.5%
Angry 0.5%

AWS Rekognition

Age 40-48
Gender Male, 83%
Happy 97.3%
Calm 1.4%
Disgusted 0.4%
Surprised 0.4%
Confused 0.2%
Sad 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 37-45
Gender Male, 99%
Calm 54.8%
Sad 12.7%
Angry 9.8%
Surprised 7.1%
Confused 5.5%
Disgusted 5.1%
Happy 2.8%
Fear 2.2%

AWS Rekognition

Age 26-36
Gender Male, 100%
Calm 68.6%
Sad 15.8%
Confused 7.9%
Happy 2.1%
Disgusted 1.8%
Angry 1.6%
Surprised 1.4%
Fear 0.8%

AWS Rekognition

Age 14-22
Gender Female, 53.7%
Calm 99.6%
Sad 0.2%
Happy 0.1%
Angry 0%
Fear 0%
Confused 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 26-36
Gender Male, 98.8%
Angry 26.1%
Calm 23.5%
Happy 18.8%
Sad 10.1%
Confused 8.1%
Disgusted 4.7%
Surprised 4.6%
Fear 4.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Couch
Person
Couch 92.7%
Person 91.2%
Person 83.7%
Person 82.2%
Person 81.9%
Person 75%
Person 74.6%
Person 73.5%
Person 72.4%
Person 71.9%
Person 71.7%
Person 71.1%
Person 70.8%
Person 70.3%
Person 60.5%
Person 56.7%
Person 49.7%
Person 42.1%

Categories

Imagga

interior objects 100%

Captions

Microsoft
created on 2022-03-05

a display in a room 80.2%
a photo of a person 60%

Text analysis

Amazon

NA
EVEETA
emer
Vagoy