Human Generated Data

Title

Untitled (business men at meeting)

Date

c. 1940

People

Artist: Harris & Ewing, American 1910s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.22321

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.5
Human 99.5
Person 99.4
Person 99.2
Person 98.5
Person 98.4
Person 97.9
Person 97.8
Person 97.5
Person 96.3
Person 95.6
Person 95.5
Crowd 94.5
Person 94.1
Room 85.8
Indoors 85.8
Suit 74.6
Clothing 74.6
Coat 74.6
Overcoat 74.6
Apparel 74.6
Suit 74
Press Conference 66
People 64
Person 59.5
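
These labels and confidences match the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of reproducing such a call with boto3, assuming a hypothetical local file photo.jpg (Rekognition also accepts S3 objects):

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # drop low-confidence labels, as in the list above
        )
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))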

Clarifai
created on 2023-10-22

people 98.9
woman 96.7
monochrome 96
girl 94.8
man 93
group 93
music 92.4
group together 91.9
audience 91.7
indoors 90.4
room 89.8
art 87.4
public show 86.6
leader 86.4
boy 85.6
chair 85.4
adult 85.1
portrait 85
child 84.9
street 84.2

Imagga
created on 2022-03-11

projector 62.4
device 61.4
radiator 50.8
optical instrument 46.3
mechanism 44.4
home 24.7
room 24.1
interior 23.9
modern 23.1
house 20
grille 19.4
architecture 18.8
design 18
wall 17.6
metal 16.9
light 16.7
style 16.3
furniture 15.4
grate 15
floor 14.9
indoors 14
wood 13.3
classic 13
steel 12.4
window 12.1
lamp 12.1
black 12
luxury 12
air conditioner 11.9
technology 11.9
decor 11.5
structure 11.3
digital 11.3
table 11.2
old 11.1
equipment 11
domestic 10.8
3d 10.8
barrier 10.5
building 10.4
contemporary 10.3
business 10.3
decoration 10.1
vintage 9.9
kitchen 9.8
texture 9.7
detail 9.6
apartment 9.6
cooling system 9.5
art 9.2
new 8.9
residential 8.6
chrome 8.5
inside 8.3
headlight 8.2
retro 8.2
office 8
cabinet 7.9
empty 7.9
antique 7.8
space 7.7
industry 7.7
comfortable 7.6
electronics 7.6
elegance 7.6
power 7.5
pattern 7.5
iron 7.5
car 7.4
metallic 7.4
object 7.3
music 7.2
shiny 7.1
obstruction 7.1
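
Imagga's tagger is exposed as a REST endpoint. A sketch, assuming the v2 /tags endpoint with basic-auth API credentials and a hypothetical image URL (all values are placeholders):

    import requests

    # Placeholder credentials and image URL.
    auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/photo.jpg"},
        auth=auth,
    )
    for item in response.json()["result"]["tags"]:
        print(item["tag"]["en"], item["confidence"])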

Google
created on 2022-03-11

White 92.2
Black 89.9
Black-and-white 86.8
Lighting 86.7
Style 84.1
Line 81.8
Monochrome 75.7
Monochrome photography 75.4
Event 73.9
Ceiling 73.7
Building 69.8
Curtain 69.2
Formal wear 66.4
Stock photography 66.4
Room 65.7
Rectangle 60.7
Font 57.5
Crowd 57
Suit 56.3
Vintage clothing 55
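
These labels correspond to Cloud Vision's label detection feature. A minimal sketch with the google-cloud-vision client, again assuming a hypothetical local file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # Scores are 0-1; the list above shows them scaled to percentages.
        print(label.description, round(label.score * 100, 1))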

Microsoft
created on 2022-03-11

person 97.5
indoor 88.5
text 78
man 73.8
people 60.9
clothing 50.3
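
Microsoft's tags match Azure Computer Vision's image-tagging operation. A sketch assuming the azure-cognitiveservices-vision-computervision SDK, with endpoint and key as placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key.
    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("AZURE_KEY"),
    )
    with open("photo.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)
    for tag in result.tags:
        # Confidences are 0-1; scale to percentages as in the list above.
        print(tag.name, round(tag.confidence * 100, 1))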

Face analysis

AWS Rekognition

Age 49-57
Gender Female, 89.5%
Calm 98.5%
Angry 0.6%
Sad 0.3%
Happy 0.3%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Fear 0%

AWS Rekognition

Age 51-59
Gender Male, 94.5%
Calm 97.7%
Happy 0.9%
Sad 0.8%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 50-58
Gender Male, 98.9%
Calm 90.6%
Sad 5.2%
Angry 2.7%
Surprised 0.4%
Confused 0.4%
Happy 0.3%
Disgusted 0.3%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Male, 81.4%
Calm 94.1%
Sad 5.2%
Fear 0.2%
Happy 0.2%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0.1%

AWS Rekognition

Age 25-35
Gender Male, 98%
Sad 77.3%
Calm 9.2%
Happy 6.4%
Confused 2.1%
Fear 1.7%
Angry 1.2%
Disgusted 1.1%
Surprised 1%

AWS Rekognition

Age 22-30
Gender Male, 98.4%
Sad 96.6%
Calm 1.1%
Happy 0.7%
Confused 0.6%
Disgusted 0.4%
Angry 0.3%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 6-16
Gender Male, 83.6%
Happy 50.4%
Calm 39.4%
Confused 4.6%
Sad 3.2%
Angry 1%
Surprised 0.7%
Disgusted 0.4%
Fear 0.4%

AWS Rekognition

Age 19-27
Gender Male, 94.6%
Sad 99.2%
Confused 0.3%
Calm 0.2%
Disgusted 0.1%
Happy 0.1%
Angry 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 36-44
Gender Male, 91.8%
Happy 38.9%
Calm 27%
Disgusted 19.3%
Angry 6%
Sad 3.3%
Confused 3.2%
Fear 1.2%
Surprised 1.1%

AWS Rekognition

Age 48-54
Gender Male, 98.8%
Surprised 31.5%
Happy 28.1%
Angry 13.6%
Fear 13.2%
Confused 4.2%
Sad 3.8%
Calm 3.4%
Disgusted 2.3%
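
Each block above corresponds to one entry in the FaceDetails list returned by Rekognition's DetectFaces API when full attributes are requested. A sketch of extracting the age range, gender, and ranked emotions with boto3, assuming a hypothetical local file:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}, "
              f"{gender['Value']} {gender['Confidence']:.1f}%")
        # Emotions arrive unordered; sort by confidence as shown above.
        for emotion in sorted(face["Emotions"],
                              key=lambda e: e["Confidence"], reverse=True):
            print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")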

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely
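
The Google Vision blocks report likelihood buckets (Very unlikely through Very likely) per detected face rather than percentages. A sketch using the google-cloud-vision face detection feature:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihoods are enum buckets: VERY_UNLIKELY .. VERY_LIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)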

Feature analysis

Amazon

Person 99.5%
Person 99.4%
Person 99.2%
Person 98.5%
Person 98.4%
Person 97.9%
Person 97.8%
Person 97.5%
Person 96.3%
Person 95.6%
Person 95.5%
Person 94.1%
Person 59.5%
Suit 74.6%
Suit 74%

Categories

Imagga

interior objects 99.9%

Text analysis

Amazon

i
KODYK-200EE1A
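
The detected strings above, including the fragmentary film-edge marking, are the kind of output Rekognition's DetectText API returns. A minimal boto3 sketch, assuming the same hypothetical local file:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":  # skip per-word duplicates
            print(detection["DetectedText"])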