Human Generated Data

Title

Untitled (audience seen from stage at Armo Mills Poultry Meeting)

Date

1950

People

Artist: Harry Annas, American, 1897–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.2631

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 97.9
Indoors 93.4
Interior Design 93.4
Person 92.5
Person 87.3
Clothing 81.4
Apparel 81.4
Person 81.2
Meal 79.5
Food 79.5
Person 76.2
Room 75.3
Person 74.8
Restaurant 72.9
Floor 70.8
Person 67.4
People 65.8
Cafeteria 65.6
Face 64.9
Suit 64.2
Coat 64.2
Overcoat 64.2
Silhouette 63.5
Person 62
Plant 61.8
Person 61.6
Crowd 60.2
LCD Screen 58.4
Monitor 58.4
Display 58.4
Electronics 58.4
Screen 58.4
Photography 56.4
Photo 56.4
Flooring 56.1
Person 49.9
Person 47.9
Person 43
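
The label/confidence pairs above follow the shape of AWS Rekognition's DetectLabels response. A minimal sketch of how such tags could be regenerated with boto3; the file name, region, and thresholds are assumptions, not part of the original record:

```python
# Sketch: reproduce Rekognition-style tags for a local image file.
# Assumptions: boto3 has configured credentials, and the scan of the
# photograph is available locally as "annas_poultry_meeting.jpg".
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("annas_poultry_meeting.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # cap on returned labels
        MinConfidence=40.0,  # lowest score in the listing above is ~43
    )

# Print "label confidence" pairs in the same format as the listing.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```

The repeated Person rows are most likely per-instance detections, which Rekognition reports under each label's Instances key with their own confidences.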

Clarifai
created on 2023-10-26

monochrome 99.7
people 99.3
indoors 98.8
room 97.1
woman 95.3
window 94.8
man 93.5
group 92.2
furniture 91.8
child 91.5
street 90.7
chair 89.8
adult 88.3
wedding 85.9
architecture 85.8
girl 85.6
seat 85.4
portrait 85.1
museum 84.5
inside 84.1
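
Clarifai's tags (generated later, in 2023) follow the same label/score pattern. A sketch against Clarifai's v2 REST API; the token variable, the image URL, and the exact public-model route are assumptions:

```python
# Sketch: request general-model concepts from Clarifai's v2 REST API.
# Assumptions: CLARIFAI_PAT holds a valid personal access token, and the
# public general-model route below matches your account's API version.
import os
import requests

url = ("https://api.clarifai.com/v2/users/clarifai/apps/main"
       "/models/general-image-recognition/outputs")
headers = {"Authorization": f"Key {os.environ['CLARIFAI_PAT']}"}
payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.org/annas_poultry_meeting.jpg"}}}
    ]
}

resp = requests.post(url, headers=headers, json=payload, timeout=30)
resp.raise_for_status()

# Each concept carries a name and a 0-1 score; scale to match the listing.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```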

Imagga
created on 2022-01-15

window 41.1
building 41
architecture 34.6
door 33.9
structure 29.5
house 28.5
sliding door 26.7
home 26.3
interior 24.8
room 19.9
balcony 19.4
windows 19.2
old 16.7
movable barrier 16.4
wall 16.4
modern 16.1
decor 15
design 14.6
greenhouse 14.6
decoration 14.5
window screen 14.1
style 14.1
residential 13.4
light 13.4
office 13.1
framework 12.7
indoors 12.3
barrier 12.1
screen 12
indoor 11.9
city 11.6
furniture 11.5
urban 11.4
construction 11.1
shop 11.1
inside 11
glass 10.9
hall 10.7
chair 10.6
travel 10.6
protective covering 10.2
table 9.7
living 9.5
luxury 9.4
historic 9.2
facade 9
supporting structure 8.9
residence 8.8
ancient 8.6
hotel 8.6
roof 8.6
flower 8.5
floor 8.4
exterior 8.3
lamp 7.8
chandelier 7.8
column 7.7
village 7.7
garden 7.6
frame 7.6
covering 7.6
estate 7.6
real 7.6
stone 7.6
brick 7.5
traditional 7.5
tourism 7.4
town 7.4
retro 7.4
gate 7.3
new 7.3
business 7.3
people 7.2
dirty 7.2
open 7.2
wooden 7
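
Imagga's tagger returns the same 0-100 confidences. A sketch against its v2 tagging endpoint; the credential variables and image URL are assumptions:

```python
# Sketch: fetch tags from Imagga's v2 tagging endpoint.
# Assumptions: IMAGGA_KEY / IMAGGA_SECRET are valid API credentials and
# the image is reachable at a public URL.
import os
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/annas_poultry_meeting.jpg"},
    auth=(os.environ["IMAGGA_KEY"], os.environ["IMAGGA_SECRET"]),
    timeout=30,
)
resp.raise_for_status()

# Imagga reports a 0-100 confidence per tag, as in the listing above.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```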

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

black and white 93.6
window 89.1
art 67.1
building 59.6
house 51.9
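
Microsoft's tags come from Azure Computer Vision's image-tagging operation. A sketch against its REST endpoint; the endpoint/key variables, the v3.2 API version, and the image URL are assumptions:

```python
# Sketch: call Azure Computer Vision's image-tagging endpoint.
# Assumptions: AZURE_CV_ENDPOINT / AZURE_CV_KEY identify a deployed
# Computer Vision resource; v3.2 is assumed, yours may differ.
import os
import requests

resp = requests.post(
    f"{os.environ['AZURE_CV_ENDPOINT']}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": os.environ["AZURE_CV_KEY"]},
    json={"url": "https://example.org/annas_poultry_meeting.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Confidence is 0-1 in the API; the listing above scales it to 0-100.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```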

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 96.2%
Sad 80.7%
Calm 17.5%
Confused 0.7%
Happy 0.3%
Angry 0.3%
Disgusted 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 36-44
Gender Male, 97.8%
Calm 63%
Sad 24.1%
Confused 4.3%
Fear 3.9%
Surprised 2.4%
Disgusted 0.8%
Happy 0.8%
Angry 0.7%

AWS Rekognition

Age 23-33
Gender Male, 71.3%
Sad 52.1%
Calm 34.6%
Confused 8.8%
Fear 2.1%
Angry 0.7%
Disgusted 0.7%
Happy 0.6%
Surprised 0.4%

AWS Rekognition

Age 37-45
Gender Female, 64.7%
Confused 23.5%
Disgusted 21.7%
Sad 19.7%
Calm 18%
Surprised 9.4%
Fear 4.3%
Happy 2.3%
Angry 1.1%

AWS Rekognition

Age 38-46
Gender Male, 93.6%
Calm 99.4%
Sad 0.4%
Confused 0.1%
Happy 0%
Angry 0%
Surprised 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 14-22
Gender Female, 98.9%
Calm 90%
Sad 5.9%
Confused 1.6%
Happy 1%
Angry 0.6%
Surprised 0.5%
Fear 0.3%
Disgusted 0.2%
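
Each AWS Rekognition block above describes one detected face: an estimated age range, a gender guess with its confidence, and an emotion distribution summing to roughly 100%. A sketch of how those fields could be read out of DetectFaces; the file name and region are assumptions:

```python
# Sketch: per-face age/gender/emotion read-out in the format used above.
# Assumption: the scan is available locally as "annas_poultry_meeting.jpg".
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("annas_poultry_meeting.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # "ALL" is needed for age, gender, emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, "
          f"{face['Gender']['Confidence']:.1f}%")
    # Emotions arrive unsorted; the listing orders them by confidence.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
    print()
```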

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
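
The Google Vision blocks report per-face attributes on a likelihood scale (Very unlikely through Very likely) rather than percentages. A sketch with the google-cloud-vision client; the credentials setup and file name are assumptions:

```python
# Sketch: per-face likelihood read-out matching the blocks above.
# Assumptions: GOOGLE_APPLICATION_CREDENTIALS is configured and the scan
# is available locally as "annas_poultry_meeting.jpg".
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("annas_poultry_meeting.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def pretty(likelihood):
    # Likelihood is an enum (UNKNOWN, then VERY_UNLIKELY through
    # VERY_LIKELY); render it as "Very unlikely" etc., as above.
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))
    print()
```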

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

OIL
DE
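
These two fragments are Amazon's text detections, most likely partial words from signage visible in the hall. A sketch of the corresponding DetectText call; the file name and region are assumptions:

```python
# Sketch: list detected text fragments as in the "Text analysis" section.
# Assumption: the scan is available locally as "annas_poultry_meeting.jpg".
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("annas_poultry_meeting.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE entries group WORD entries; the listing shows line-level text.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```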